Rust has been rather monolithic. There is no such thing as a Rust spec, just whatever the Rust compiler does. This is a good first step toward making it a language people work in rather than just blog about.
A code-identical fork with a different name for trademark reasons is not going to make the Rust team write a spec. Look to the GCC Rust implementation for that.
Also, people already do work in Rust. It's in the Linux kernel right now.
Compiler behavior is the only thing that matters in any language, per Hyrum's law. A spec is a useful tool for discussing and building a language, but don't mistake the map for the territory.
By passing it faked hardware. Yes, you have to write your APIs so they are testable. Yes, it is virtually impossible to retrofit unit tests into an old, large code base that was written without regard to testability. But no, it is not difficult at all to fake or mock hardware states in code that was designed with some forethought.
That may hold for a trivial device or a perfectly spec-compliant device. However, the former is not interesting and the latter does not exist. I agree that more test coverage would be beneficial, but I think you're heavily downplaying the difficulty of writing realistic mock hardware.
Do you have experience doing this in C/C++? There are a bunch of things about the language models for both (e.g. how symbol visibility and linkage work) that make doing DI in C/C++ significantly harder than in most other languages. And even when you can do it, doing this generally requires using techniques that introduce overhead in non-test builds. For example, you need to use virtual methods for everything you want to be able to mock/test, and besides the overhead of a virtual call itself this will affect inlining and so on.
This doesn't even consider the fact that issues related to things like concurrency are usually difficult to properly unit test at all unless you already know what the bug is in advance. If you have a highly concurrent system and it requires that a bunch of different things be in some specific state in order to trigger a specific bug, of course you CAN write a test for this in principle, but it's a huge amount of work and requires that you've already done all the debugging. Which is why developers in C/C++ rely on a bunch of other techniques, like sanitizer builds, to catch issues like this.
Right, doing interfaces that support DI would also force Linux to grow up and learn how to build and ship a peak-optimized artifact with de-virtualization and post-link optimization and all the goodies. It would be a huge win for users.
The fact that it would be hard to test certain edge cases does not in any way excuse the fact that the overwhelming bulk of functions in Linux are pure functions that are thread-hostile anyway, and these all need tests. The hard cases can be left for last.
For secular states. The coat of arms of the Kingdom of Jerusalem used gold on silver. The coat of arms of the papacy on the other hand is not the flag so they can do whatever they feel like there.
I’ve never seen a more positive response to a logo change.
I honestly can’t think of any positive response to a logo change lately. The flattening of Google’s word mark a decade ago is the most recent that comes to mind. I’m sure there are others, but the point is they’re almost never this well-received, imo.
Mathematics as shown in textbooks is not rigorous. I don't understand why so many people fetishize something they saw at university.
Formalised mathematics is incomprehensible to humans and orders of magnitude longer than anything you will see in a textbook or in mathematical papers outside automated theorem proving.
This reminds me of a tangential rant in the book "The Poincaré Conjecture":
"... the postulates are unclear. Does postulate 2 mean that we can extend any line segment forever? Does it mean that we can cut up any segment? And if it means the first, who is to say that the resulting line is unique? And how seriously should we take the definitions? Are they just meant to provide guidance about a word that is essentially undefined (today's, and probably Euclid's, interpretation) or are they supposed to completely specify the object named? In the latter case, just what does the phrase "a breadthless length" mean?
Mathematicians and scholars know that there are gaps in Euclid, and there has been a great deal of discussion over the ages about alternate axioms, or possible additional ones. That has not stopped generations of worshipful schoolmasters, besotted with the majestic order, the accessibility and the patent usefulness of the Elements, from rushing in and trumpeting it as the finest in human thought. However, to a thoughtful student, the Elements can seem less rational than capricious. The insistence that the Elements is flawless, and the apex of rigorous thought, turns some students away from mathematics. One wonders how much fear of mathematics stems from the disjuncture between the assertion that Euclid is perfect and some students' intuitive, but difficult to articulate, sense that some things in it are not quite right. Unless you are unusually rebellious, it is easy to blame yourself and conclude that mathematics is beyond you.
It is worth bearing in mind that mathematical results, for all they are represented as eternal and outside specific human cultures, are in fact transmitted and understood within definite social and cultural contexts. Some argue, for example, that the Greeks invented proof in order to make sense of the statements of mathematical results of Babylon and Egypt without access to the context in which such results were used and discovered. In order to make use of the results, the Greeks needed to sort out different, seemingly..."
We might have different definitions, my good sir. Granted, I don't have a PhD in math and my math stopped at a master's. To me formalization is theorems and proofs, which are 100% comprehensible to humans. For reference: http://www.vdash.org/formal/#math
Everything in that link looks like programming, especially when you consider something like Haskell. Even formalism as a philosophy tries to add logic to natural language itself. So I am not sure where I am fetishizing what I saw at university. Care to explain without the snark?
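For flavor, here is a tiny toy example (not taken from the vdash link) of what a formalized statement looks like in Lean 4; this is the sense in which it reads like typed functional programming:

```lean
-- A formal theorem and its proof term: the statement is a type,
-- and the proof is a program inhabiting that type.
theorem add_comm_example (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```
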
Ah, I did say they were proofs... that is my fault. What I meant to say is that formalization, the way they are written, is very much a language. Either way, not sure where the venom was coming from.
I never said they were proofs, right? I am still confused. I was talking about formalizing and how, at least to me, I see programming in it. Even something extremely complex like "The Strong Perfect Graph Theorem" is still readable and reads like English plus programming. That's why I said Haskell looks like it comes out of a discrete math book. I am not sure what your point is or where my fetishization is coming from.