Just because expert C programmers were able to make it work doesn't mean that C is the best way to write secure code. I say this as someone who writes quite a bit of C (though I don't view myself as an expert in it yet).
I highly doubt they just drew C out of a hat and decided to make it work for their multi-million-dollar, "absolutely cannot fail" software. They would have considered the other languages out there. C was the winner.
Speaking from the high-assurance niche: C absolutely sucked for that in more ways than you can imagine. As with the recent NASA post, the reason they use C is tooling and available talent. Ridiculous amounts of manpower went into developing static analysis, testing, compilers, etc. for C. The best talent in high assurance pretty much all know C, too, whether or not it's their default language. Almost all the hardware supports it as well.
So, a combo of top talent w/ a C history and tons of tooling made most of them go with C. Many others went with Ada/SPARK. A few small fish even use typed or structured assembler. C is only chosen due to tooling and a "better the devil you know" philosophy. Work on superior languages got more results with less effort in both human and mechanical reviews, though. So the inferior language is only used due to an accident of history.
Are you recommending that people switch to Ada, or Pascal, or what? Your pastebin seems to strongly favor Algol-likes. It would be nice to see some more substantive reasons why people should be using an Algol-like language in 2016. Just saying that C was originally hacked together to be expedient doesn't quite make that case.
There are a number of HLL's to choose from today with fewer safety issues. Start with an efficient one. If developing languages or compilers, then put the effort into one that has a nice foundation instead of C. Similarly, there's prior work on C-like languages with enhanced safety (Cyclone, Ivory) and tools like SoftBound+CETS that automatically make code safe w/ a performance penalty. Use or invest effort in such tools. And finally, if nothing else is available, use a subset of C with a sensible coding style and throw every bug-finding tool you can at it.
Those are the recommendations, in that order. What you use depends on your domain, constraints, skills, and so on. Modula-2, Active Oberon, Modula-3, Free Pascal, and Component Pascal were probably closest to C's domain in terms of safe, efficient, easy-to-compile languages. Ada is systematic in all the errors it handles but has a steep learning curve. SPARK straight-up proves your code. One can also use DSL's/4GL's that generate proper C w/ automated checks, like iMatix and Galois do (and I once did). I've also seen OCaml and LISP used for real-time, even embedded, systems with custom runtimes. OCaml was also used for tooling w/ source-to-object-code verification.
So, there are a number of options. Each is currently behind C in tooling due to limited uptake and investment. More uptake and investment can fix that. Even so, the average results with those tools are far fewer defects than in C code, with a shorter time to train people (ignoring Ada). That most teams aren't composed of geniuses in safe coding means that's important, too.
ATS is interesting. I've seen it used in device drivers and one 8-bit system, IIRC. The reason I left it off is that I'm not sure most people can learn how to use it. Whereas we've taught ALGOL languages, Scheme, and OCaml to all kinds of people, and they're effective enough with types & assertions.
These more mathematical languages have to prove themselves out with real-world examples (algorithms and compiled code) before I recommend them. I'd like to see CompSci keep hammering away at them to find easier and easier ways of learning and using them, not to mention improving the tooling.
Oh yeah, you had another question: why ALGOL-like languages? I chose ALGOL-like over C-like specifically rather than over every style. The reason is that ALGOL-like languages had better design decisions:
1. Be closer to English to make them easier to read. Important, as software is read more than it's written. It also helps in reviews.
2. Generally have proper features like modules or interfaces to aid programming in the large.
3. Still efficient enough for programming close to the metal.
4. Prefer sane defaults or automated checks on issues that trip programmers up. A good implementation usually lets you turn these off for a module if needed.
5. Tradition of thorough language reports and grammars means less undefined behavior.
6. Most common stuff is still efficient once compiled. If Wirth-like, then also efficient to compile.
7. Less tied to a specific processor style with its own style being more amenable to safe, maintainable operation. See Burroughs B5000 processor vs PDP-11's.
8. IMPORTANT. Easier to develop tooling for verification, esp static or dynamic analysis.
9. Easier to verify mathematically.
So, these are some things off the top of my head that various ALGOL-like languages did better than C-like languages. I wouldn't say there was a single one that had all these traits, but Modula-3 came close. A modified version of Go might as well, given its predecessors were used in safe-language OS's. Hope this brainstorm gives you some idea of what I valued vs C's style.
I suspect NASA chose C due to some combination of (a) the tiny verified subset they were using was secure; (b) it doesn't matter that much how secure your language is when you're spending countless sums to verify it anyway; (c) their people and/or their new hires had experience with C; (d) their systems were designed for it. None of these reasons are valid reasons to start a new, reasonably-budgeted, security-critical project in C.
Great response except for (b). The language choice matters. Some are too complex to verify in typical usage. Fortunately, despite its complexity, C was primitive enough for tools to catch up with a subset of it. One can overdo it. See INTERCAL. ;)