The README.md is 9k of dense text, but it does explain the claims: faster, more efficient, more accurate, and more sensible.
Rust port feature: The implementation "passes 93.8% of Mozilla's test suite (122/130 tests)" with full document preprocessing support.
Test interpretation/sensibility: The 8 failing tests "represent editorial judgment differences rather than implementation errors." It notes four cases involving "more sensible choices in our implementation such as avoiding bylines extracted from related article sidebars and preferring author names over timestamps."
This means that the results are 93.8% identical, and the remaining differences are arguably an improvement.
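A quick sanity check of the quoted pass rate (trivial arithmetic, nothing more):

```python
# Verify the quoted test-suite pass rate: 122 of 130 Mozilla tests.
passed, total = 122, 130
pass_rate = passed / total * 100
print(f"{pass_rate:.1f}% ({passed}/{total}), {total - passed} failing")
# → 93.8% (122/130), 8 failing
```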
Further improvement in extraction accuracy: Document preprocessing "improves extraction accuracy by 2.3 percentage points compared to parsing raw HTML."
Performance:
* Built in Rust for performance and memory safety
* The port uses "Zero-cost abstractions enable optimizations without runtime overhead."
* It uses "Minimal allocations during parsing through efficient string handling and DOM traversal."
* The library "processes typical news articles in milliseconds on modern hardware."
It's not explicitly written, but based on these four points I think it's a reasonable assumption that it is significantly faster than the original JavaScript implementation, given its "millisecond" processing times. Perhaps it's also better memory-wise.
I would add a comparison benchmark (memory and processing time), perhaps with bar charts to make it clearer, along with the 8 examples of differing editorial judgement for people who scan-read.
I was responsible for allowing third-party e-mail clients to connect to Exchange; it was decided that Thunderbird was allowed, and support was implemented. It can be done if people are aware of the needs, can implement it securely, and can evaluate the risks.
For me it's like the Pebble of smart-glasses land: simple and elegant.
Less is more, just calendar, tasks, notes and AI. The rest I can do on my laptop or phone (with or without other display glasses).
I do wish there were a way to use the LLM on my Android phone with it and, if possible, to write my own app for it, so I'm not dependent on the internet and have my HUD/G2 as a lightweight, custom-made AI assistant.
Strictly speaking, the mobile Oculus/Meta Go/Quest headsets were Linux/Android based; you can run a Termux terminal with Fedora/Ubuntu on them and use an Android VNC/X app for the 2D graphical part. But I share your SteamOS enthusiasm.
That's great, and that's what I aspire to, but since typing and sending an e-mail is so easy and quick, I just send it like that. I remember the days when I hand-wrote the occasional letter and delivered it myself or sent it by post.
Would you also consider handwriting a letter and then sending it via fax2email an option? If not, why not? Writing a letter can be much more intentional, while the sending process could be automated.
I remember I bought a German book of bundled talks/essays at the Goetheanum bookshop last year about how to relate to the digital revolution. Distracted by the internet, I haven't had time to read it yet.
"Das Ende des Menschen? Wege durch und aus dem Transhumanismus" (The End of Man? Ways Through and Out of Transhumanism), edited by Ariane Eichenberg and Christiane Haid.
Often I'll include a stamped postcard, addressed to my PO Box, because I think there is something important about paying for the privilege to communicate with somebody off-line [the stamp]. It forces your message to be more concise/worthwhile.
There is also something sweet about having a built-in delay for the message to "gestate" — perhaps if politically-related, your point is even further reinforced as "prescient," as the pre-dated postmark attests (upon delayed arrival). Perhaps you're wrong and wasted a stamp.
----
Mostly I agree with (I believe) P.G.'s premise that email is nothing more than a to-do list that anybody can add to. I do not wish ever to be immediately reachable again, and this is an expensive freedom/lifestyle.
I am simply too angry to have access to a system [email] where I can immediately tell anybody in the world how I feel about something [and did for a quarter-century]. If something really bothers me, it has to be worth a postage stamp (I usually write postcards, but also have thousands of FOREVER Stamps™).
Athanasiou, Tom (1985). “Artificial intelligence: cleverly disguised politics”. In: Compulsive technology: computers as culture. Ed. by Tony Solomonides and Les Levidow. Free Association Books, pp. 13–35
RSA may well be deprecated long before quantum computing can break it. Post-quantum cryptography standards are already being deployed:
GnuPG 2.5 has introduced support for Kyber (ML-KEM, FIPS-203).
Kyber was added in OpenSSH 9.9 (September 2024) as a hybrid post-quantum key-exchange algorithm and made the default key-exchange mechanism in OpenSSH 10.0 (April 2025). The implementation uses a hybrid approach, combining classical cryptography (X25519) with ML-KEM-768 for key exchange (mlkem768x25519-sha256).
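The point of the hybrid construction is that the session secret depends on both exchanges, so an attacker has to break both X25519 and ML-KEM-768. A minimal Python sketch of that combiner idea (the placeholder byte strings stand in for the real shared secrets; this illustrates the principle, not the exact OpenSSH wire format):

```python
import hashlib

def hybrid_shared_secret(k_pq: bytes, k_classical: bytes) -> bytes:
    """Combine a post-quantum and a classical shared secret.

    Sketch of the hybrid idea: hash the concatenation, so the result
    stays secure as long as EITHER input secret remains unbroken.
    """
    return hashlib.sha256(k_pq + k_classical).digest()

# Placeholder secrets for illustration only; in OpenSSH these come
# from the ML-KEM-768 decapsulation and the X25519 exchange.
k_pq = b"\x01" * 32
k_cl = b"\x02" * 32
key = hybrid_shared_secret(k_pq, k_cl)
print(len(key))  # → 32 (a SHA-256 digest)
```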
OpenSSH plans to add support for post-quantum signature algorithms in the future. OpenSSL 3.5.0 (April 2025) supports ML-KEM, ML-DSA and SLH-DSA.
https://medium.com/asecuritysite-when-bob-met-alice/a-long-g...
The missing context in John's post is an exposition of why progress has been slow (decoherence and error rates) and what the rate of progress could be once these obstacles are resolved. Shor's algorithm requires fault-tolerant quantum computing, which didn't exist in any form until recently.
The physical error rate must be below the required threshold. Different research groups and companies, using various qubit technologies, have already demonstrated techniques and elements of below-threshold error correction.
https://www.nature.com/articles/s41586-024-08449-y
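The cited result is about operating below that threshold: in a surface code, the logical error rate is suppressed roughly as (p/p_th)^((d+1)/2) with code distance d, so increasing the distance only helps once the physical error rate p is under the threshold p_th. A back-of-the-envelope sketch with illustrative numbers (not figures from the paper):

```python
def logical_error_rate(p: float, p_th: float, d: int) -> float:
    """Rough surface-code scaling: suppression goes as (p/p_th)^((d+1)/2).

    Illustrative model only; real constants depend on the code and decoder.
    """
    return (p / p_th) ** ((d + 1) // 2)

# Below threshold (p < p_th): larger distance -> lower logical error rate.
below = [logical_error_rate(0.001, 0.01, d) for d in (3, 5, 7)]
assert below[0] > below[1] > below[2]

# Above threshold (p > p_th): larger distance makes things worse.
above = [logical_error_rate(0.02, 0.01, d) for d in (3, 5, 7)]
assert above[0] < above[1] < above[2]
```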
Agreed on your last point. It's fairly obvious, I think, that we won't factor 1024-bit numbers by any direct evolution of the techniques used in smaller circuits. The hope is that you reach a threshold (error correction) where a completely different regime is unlocked, allowing you to implement a fundamentally different technology.