> The proven fact that the said technology has failed its purpose,
How, because other solutions are being explored? That's not due to a failure of one thing, but because both defense in depth and a desire to fix existing systems with no additional engineering are paths that security researchers and vendors explore. Not everyone will converge on a single solution, even when that solution is practical.
Just because something is not being used universally doesn't mean that it has failed. More to the point, it is not widely known, and rumors persist that it requires extraordinary effort, often reinforced by well-meaning but rather outdated advice.
> desire to fix existing systems with no additional engineering
I, too, enjoy sci-fi.
> Just because something is not being used universally doesn't mean that it has failed.
You are only correct in the dictionary sense of these words. Fact is that a lot of programmers are vain creatures prone to ego, and they make their chosen technical stack part of their core identity. This prevents them from being flexible; they get rigid as they age and become part of the very problems they so passionately wanted to fix when they were young.
None of that is made easier by the managerial class that absolutely loves and financially stimulates the programmers who don't want to rock the boat.
So I'd say that CBMC, and likely other tools in the same area, has more or less failed if it could not convince a critical mass of C/C++ devs to use it and finally start keeping up with Rust (and the other languages @pjmlp mentioned).
> More to the point, it is not widely known, and rumors persist that it requires extraordinary effort, often reinforced by well-meaning but rather outdated advice.
The victims of Heartbleed and many other CVEs don't care. The breaches happened anyway.
I am amazed at your desire to downplay the problem and keep claiming that eventually stuff will work out.
I disagree. And I'll repeat a very core part of my argument: C/C++ devs were handed a monopoly in their areas for decades and they still can't arrive at a set of common techniques that reduce or eliminate memory safety bugs.
I am not impressed. And I am not even a particularly good programmer. Just a diligent guy with average programming ability whose only unique trait is that he refuses to accept the status quo and always looks at how stuff can be improved. But this has taken me a long way.
I was characterizing these hardware changes as being fantasy, so I'm glad you agree.
> So I'd say that CBMC, and likely other tools in the same area, has more or less failed if it could not convince a critical mass of C/C++ devs to use it
So, in the same vein, Rust has failed because it has only been around for a similar amount of time and people still use C/C++?
> The victims of Heartbleed and many other CVEs don't care. The breaches happened anyway.
I fail to see how a CVE that occurred due to poor engineering practices has anything to do with the adoption of good engineering practices and tooling. Yes, Heartbleed is why we need this tooling.
You are simultaneously arguing that if we could just adopt Rust, our problems would be solved, but that since another technology has not yet been adopted, it has failed. By your account, Rust isn't adopted due to programmer ego, yet tooling that does the same thing as Rust and more is unadopted because it has "failed." Do you not see the logical inconsistency in your position?
> So, in the same vein, Rust has failed because it has only been around for a similar amount of time and people still use C/C++?
Yes, it kind of failed there indeed. And I even hinted at why: Rust is far from perfect and its async implementation is a cobbled-together mess. Golang's model reads much better, though I hate its foot-guns quite a lot (like how writing to a closed channel leads to a panic; who thought that was a good idea?).
> I fail to see how a CVE that occurred due to poor engineering practices has anything to do with the adoption of good engineering practices and tooling. Yes, Heartbleed is why we need this tooling.
You can't see it? But... the good practices do lead to fewer of these CVEs, as you yourself seem to realize? I don't get this part of your comment.
> You are simultaneously arguing that if we could just adopt Rust, our problems would be solved, but since another technology has not yet been adopted, it has failed.
You have answered it yourself: a lot of people see manual wrangling of `void**` as a badge of honor and their ego takes over (and the fear of being displaced, of course). I claim that Rust is not being more widely adopted due to programmer ego and fear of becoming obsolete. The fear that the nice salaries will end because they belong to a diminishing cohort of old-school cowboys.
Who would not fear that? Who would want that to end?
> Do you not see the logical inconsistency in your position?
No, and I don't get your argument. The reasons for C/C++ devs not improving the memory safety of their code, and the reasons for them not adopting Rust are very different. Not only is the analogy bad, it is plain inapplicable.
---
But it also does not help that HN reacts like a virgin schoolgirl pinched on the arse when Rust is mentioned. I've coded it for a few years, I loved it, I hated the bad parts and called them out, but even to this day I very quickly and easily get branded as a Rust fanboy even if my comment history shows balanced criticisms towards it. People don't care. People are emotional and are quick to put you in a camp that's easy to hate.
That is the part that I truly hate. No objective debate.
Too expensive to move to Rust? GOOD! That's an amazing argument; we could talk about that for weeks and get very interesting insights in both directions.
People unwilling to get re-trained? Also a good argument, with big potential for interesting insights!
But most of everything else is at the level of a heated table debate after the 11th beer. Pretty meh and very uninteresting. No idea why I keep engaging, I think I am just bitter that people who REALLY should know better are reacting on emotion and not on merit. But that's on me. We all have our intolerances to the reality we inhabit. This is one of mine.
That's a rather cynical interpretation of these initiatives. CHERI, for instance, has been in development for twenty years. It predates the general availability of open source tools like CBMC or languages like Rust. But that doesn't make the concept obsolete. It makes it complementary.
Hardware security is complementary to software security. Mitigations at the hardware level, the hypervisor level, and the operating system level complement architectural, process, and tooling decisions made at the software level.
Defense in depth is a good thing. There can always be errors in one layer or another, regardless of software solution, operating system, hypervisor, or hardware. I can wax poetic about current CPU vulnerabilities that must be managed in firmware or operating systems.
Many of the issues caused by C are solved by Modula-2, Object Pascal, and Ada; we didn't need to wait for Rust. But those aren't the languages that come for free with UNIX.
Or even better, they would be solved by C itself, if WG 14 cared even a little about providing proper support for slices, proper arrays, and proper string types, or at least as library vocabulary types.
But what to expect, when even Dennis Ritchie wasn't able to get his approach to slices taken up by WG 14.
So hardware memory tagging, and sandboxed enclaves it is.
There is nothing wrong with defense in depth. But, this is not where things stop.
I make extensive use of bounded model checking in my C development. I also use privilege separation, serialization between separate processes, process isolation, and sandboxing. That's not because bounded model checking has somehow failed, but because humans are fallible. I can formally verify the code I write, but unless I'm running bare metal firmware, I also have to deal with an operating system and libraries that aren't under my direct control. These also have vulnerabilities.
That's not a trivial thing. The average software stack running on a server -- regardless of whether it is written in C, Rust, Modula-2, Pascal, Ada, or constructively proven Lean extracted to C++ -- still goes through tens of millions of lines of system software that is definitely NOT safe. All of that code is out of a developer's control for now. Admins can continually apply patches, but until those projects employ similar technology, they are themselves a risk.
One day, hopefully, all software and firmware will go through bounded model checking as a matter of course. Until then, we work with what we can, and we fix what we can. We can also rely on hardware mitigations where applicable. That's not failure as you have claimed, but practical reality.
> I make extensive use of bounded model checking in my C development...
I would absolutely love it if you were the majority; alas, you are not.
I emulate exhaustive pattern matching in my main language of choice (it's not Rust or OCaml or Haskell) because it does not have it natively, and because I saw how beneficial and useful it is. Sadly, many of the other devs using that language don't do so, and I have made a good buck going after them and fixing their mistakes.
I don't doubt your abilities as a person. I doubt the abilities of the corpus of C/C++ devs at large.
Well, that's something I hope to change. The tools required to write safer software exist. They just aren't widely distributed yet.
I can say, without ego, that I'm a reasonably good software developer. But, it is the tooling and process that I use that allows me to build safer software and that makes me a reasonably good developer. The same is true of Rust developers.
I can teach these skills to other developers, and in fact, I have plans to do so.
I don't expect things to change overnight, any more than I expect things to be rewritten in Rust overnight. C++ has been around for nearly 40 years, and software is still written in C. But, we can do better, and we must do better.