I personally use the cbrunsli/dbrunsli cmdline programs to archive old high-resolution JPEG photos that I've taken over the years. Having a gander at one subdirectory with 94 photos totaling 354 MB, running cbrunsli on them brings the size down to 282 MB, a savings of about 20%. And if I ever wanted to convert them back to JPEG, each file would be bit-identical to the original.
Perhaps it's a little early to trust my data to JPEG XL/Brunsli, but I've run tests comparing hundreds of MD5 checksums of JPEG files losslessly recreated by Brunsli, and have not yet run into a single mismatch.
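For anyone who wants to reproduce this kind of check, here's a minimal sketch of the round trip in Python. It assumes cbrunsli and dbrunsli are on PATH and take input and output paths as positional arguments; the directory names are just placeholders:

    import hashlib
    import subprocess
    from pathlib import Path

    src = Path("photos")        # originals (placeholder path)
    work = Path("brunsli-out")  # compressed .br files and round trips
    work.mkdir(exist_ok=True)

    def md5(path: Path) -> str:
        return hashlib.md5(path.read_bytes()).hexdigest()

    before = after = 0
    for jpg in sorted(src.glob("*.jpg")):
        br = work / (jpg.stem + ".br")
        back = work / (jpg.stem + ".roundtrip.jpg")
        # Compress to Brunsli, then decompress back to JPEG.
        subprocess.run(["cbrunsli", str(jpg), str(br)], check=True)
        subprocess.run(["dbrunsli", str(br), str(back)], check=True)
        before += jpg.stat().st_size
        after += br.stat().st_size
        if md5(jpg) != md5(back):
            print(f"MISMATCH: {jpg.name}")

    if before:
        print(f"{before} -> {after} bytes ({1 - after / before:.1%} saved)")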
I can only say that I am very excited for the day that JPEG XL truly hits primetime.
Brunsli works very well, but it is not compatible with the final JPEG XL format. To reduce the binary size of libjxl, we decided on a more holistic approach where the VarDCT machinery is used to encode JPEGs losslessly. This saved about 20% of the binary size and reduced the attack surface. The decoder binary is now about 300 kB on Arm.
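For reference, the equivalent round trip with the final-format tools looks roughly like this (a sketch; as far as I know, cjxl from libjxl defaults to lossless JPEG recompression for JPEG input, and djxl reconstructs the original JPEG bitstream when asked for a .jpg output):

    import subprocess
    from pathlib import Path

    orig = Path("photo.jpg")  # placeholder input
    jxl = Path("photo.jxl")
    back = Path("photo.roundtrip.jpg")

    # For JPEG input, cjxl stores the original DCT coefficients plus
    # reconstruction data rather than re-encoding the pixels.
    subprocess.run(["cjxl", str(orig), str(jxl)], check=True)
    # With a .jpg output name, djxl reconstructs the original JPEG.
    subprocess.run(["djxl", str(jxl), str(back)], check=True)

    same = orig.read_bytes() == back.read_bytes()
    print("bit-identical" if same else "mismatch")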
https://jpeg.org/jpegxl/
Reducing the size of existing image collections with zero quality loss will make JPEG XL a success no matter what else happens.