The aim is absolutely to implement the entire ECMAScript specification. Progress has slowed down recently, as I've been both busy with other things and tied up in making the engine work with interleaved GC.
A secondary aim is to have a bunch of feature flags that allow the engine to drop support for specification parts that a particular embedder doesn't care about. That obviously fights with the "implement the entire ECMAScript specification" goal, but I just hate indexed property getters and setters with a passion and want to see them gone wherever I go.
Boa is a great project and I believe it is being used in some production systems. I've met and exchanged some ideas with the main developer, Jason Williams, and even received the greatest praise that I could imagine: Boa will (or did?) take some inspiration from Nova on its GC refactoring. Nova has also copied (with proper attribution of course) a few minor parts from Boa, like whitespace skipping code for some spec abstract operations.
I highly recommend keeping an eye on Boa and using it if you have the chance.
China already did a similar experiment a few years ago, and the result was that "plants can grow on the moon despite the intense radiation, low gravity, and prolonged intense light".
> Over the next eight days, this payload conducted a vital experiment where it attempted to grow the first plants on the moon.
The plants survived eight days before freezing, but important questions also include things like "How does the radiation impact their seed viability in future generations?"
I'll grant that they didn't immediately die, but I wouldn't have expected that from an ionizing environment either. Just a lot of weird quirks in the lifecycle.
"Deinococcus radiodurans is capable of withstanding an acute dose of 5,000 grays (Gy), or 500,000 rad, of ionizing radiation with almost no loss of viability, and an acute dose of 15,000 Gy with 37% viability.[14][15][16] A dose of 5,000 Gy is estimated to introduce several hundred double-strand breaks (DSBs) into the organism's DNA (~0.005 DSB/Gy/Mbp (haploid genome)). For comparison, a chest X-ray or Apollo mission involves about 1 mGy, 5 Gy can kill a human ...."
Some enterprising researchers must have considered engineering this microbe to produce useful products in space, but I don't travel in these circles anymore.
But it's just a side effect of being an extremophile with really good DNA repair machinery. The microbe isn't intentionally resistant to radiation. It's not adapted specifically to radiation environments.
I read about an effort to do this in the 1950s (IIRC it was in Pawpaw: In Search of America's Forgotten Fruit by Andrew Moore, but I could be wrong about that), and as I remember it, most of the irradiated seeds were either sterile or produced deformed offspring.
What's the difference between atomic gardening and regular selective breeding performed under the giant ball emitting ionizing radiation that we have overhead half the day, other than the rate at which mutations occur? Plants with terrible nonviable mutations might be entirely sterile even if we like them, and plants with viable but undesirable mutations we won't propagate into another generation. It seems akin to modern GMO efforts with a shotgun instead of a scalpel, but it did work.
Plants also handle mutations differently, creating burls and cavities and whatnot instead of the mutation taking over the entire existing plant the way cancer does in animals. You're unlikely to generate a Plants vs. Zombies scenario here.
Might as well say that beating grapes into pulp, without also beating the consumers of the grapes, gives the juice a one-sided evolutionary advantage.
While it sure sounds straight out of some 50s horror movie, I have a feeling the consequences here are pretty insignificant. The mutant tomatoes I've harvested and eaten from my garden have been quite tasty. Any particular fears in mind?
People doing it all around the world for almost a century now effectively is that experiment. Unsurprisingly for anyone whose understanding of science extends beyond cheap comic-book tropes, everything is fine.
Radiation isn't evil magic, and mutations don't give superpowers. Both are natural phenomena, and they're nothing like they're portrayed in comic books.
Might as well worry about watering your plants. Plants are perfectly fine; they live and grow by nature magic. No need for humans to play god and add water to the mix; what could possibly go wrong?
This is what nature has been doing for billions of years: we have constant background radiation, some radiation from the sun that still gets through, and let's not forget everybody's favorite, cosmic rays. The most energetic particle we've detected had the energy of a baseball thrown at about 100 km/h. I'd say this is the main fuel of the whole evolution of life on Earth, on top of drastically changing environments.
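Rough numbers behind that comparison, assuming the record detection of roughly 3×10^20 eV and a standard ~145 g baseball (both figures are assumptions, not from the comment above):

    // Back-of-the-envelope: kinetic energy of the most energetic cosmic-ray
    // particle detected vs. the speed a baseball would need to match it.
    const eV = 1.602e-19;                  // joules per electron-volt
    const particleEnergyJ = 3e20 * eV;     // ≈ 48 J
    const baseballKg = 0.145;              // regulation baseball mass

    const speedMs = Math.sqrt((2 * particleEnergyJ) / baseballKg); // ≈ 26 m/s
    console.log(`${particleEnergyJ.toFixed(0)} J`);                // ~48 J
    console.log(`${(speedMs * 3.6).toFixed(0)} km/h`);             // ~93 km/h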
You can't build a 100% radiation-shielded environment anywhere. Neutrinos just don't care much about obstacles (they interact only very weakly with their targets, but a small number of them still do, which is how we detect them).
On the scale that nature does it, the consumers of plants also evolve.
I can't believe what I'm being asked to argue here; it's "environmentalism" and "public health" and "anti big X" all rolled up into one. I'm on the other side of all those issues, so I wish you'd all get back in your lanes.
Honestly, the biggest "what could go wrong" is something like the vegetables no longer producing the useful large fruits we eat, if we're trying to grow things for food.
You can keep something alive for a week in a terrarium basically anywhere; I'm not sure their result would even be interesting if it weren't for the fact that it was on the moon.
That same lander (Chang'e 4) also measured the radiation dose rate on the moon. It's about 2.6x that of the ISS. That doesn't account for solar particle events, though they didn't encounter any.
I would bet a million bucks that Jobs put that price in because he basically said, "Well, if they buy the Linux version we're down one Mac sale from them, so charge them our profit margin on a Mac Pro."
That's not really comparable. Raspberry Pi added entirely separate RISC-V cores to the chip; they didn't convert an ARM core design to run RISC-V instructions.
What is being discussed is taking an ARM design and modifying it to run RISC-V, which is not the same thing as what Raspberry Pi has done and is not as simple as people are implying here.
VS Code does become slower for me on my MacBook M1. I typically don't restart the app unless I have to. After a few days of usage, scrolling the code area becomes so slow (<20 fps) that I have to restart the app. I haven't experienced such a slowdown in other IDEs, such as Xcode or Qt Creator.
I wonder if there are any other native IDEs with the same level of functionality. I feel like Electron is forcing me to compromise on performance in exchange for features.
The most important features for me would be responsiveness (I usually have multiple projects open simultaneously, and performance tends to degrade after a few days) and memory consumption.
This is the same Google that wrote the libgav1 decoder instead of using dav1d; they don't have a strong history of starting from code that was invented elsewhere.
(In the case of AV1, I think they eventually gave up and started offering dav1d on Android, but not before shipping their in-house implementation that was much less efficient and no safer than the state of the art, perplexing everyone in the process.)
It should be noted that all known independent JPEG XL decoders (especially my J40 [1] and jxl-oxide) were each primarily written by a single individual. So the request is that some library, whether a future version of jxl-oxide, a Rust rewrite of libjxl, or something else, should be written in a provably memory-safe manner, be actively maintained by the JPEG XL developers, and successfully reproduce the original libjxl's decoding performance. I think that is indeed a fair deal for both Mozilla and the JPEG XL developers.
[1] The main blocker for J40 is that I don't really want to keep it less safe than I hope to achieve, but I also want to keep it in C for practical reasons. This and my day job have prevented any significant improvements so far.
For one thing, I believe jxl-oxide is not yet performant enough to replace libjxl's decoder (not necessarily because of a difference between C++ and Rust, however).
I think it would make more sense to support the existing implementation than to start yet another one.
As for why jxl-oxide can't be used yet: it just isn't mature enough. They're still finding unimplemented or incompatible features, and some of the code is unoptimized.
JPEG XL is big and complex: it is designed to have every feature of every competing codec (from vector graphics to video) and to beat them all on compression in every case, so it's half a dozen different codecs in a trench coat.
Maybe, deep down, a large motivation for coding it in Rust is to invent something here, even if that includes reinvention. The memory safety is a by-product.
macOS 14 / Safari 17.6 shows static pics, but no animation.
Honestly, I wish all animated image formats would just die. They are an inefficient way to animate images, because they only use intra frames. H.264, H.265 or AV1 should be used for animations. Fortunately, Safari can display video files in <img> or CSS images, which makes all animated image formats unnecessary.
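A minimal sketch of that approach, using a video file as the source of an <img> element; "clip.mp4" and the fallback path are placeholders, and support outside Safari varies:

    // Use a short H.264/H.265/AV1 clip where an animated GIF/APNG/WebP would go.
    const img = document.createElement("img");
    img.src = "clip.mp4";             // placeholder path to a small looping video
    img.alt = "Looping animation";
    img.onerror = () => {
      img.onerror = null;             // avoid looping if the fallback also fails
      img.src = "clip.gif";           // fall back to a legacy animated format
    };
    document.body.appendChild(img);

    // The same idea works in CSS image contexts in Safari, e.g.:
    // document.body.style.backgroundImage = "url('clip.mp4')";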