
Probably not. One of JPEG XL's features is the ability to losslessly transcode existing JPEG images to JPEG XL while also reducing the file size:

https://jpeg.org/jpegxl/

Reducing the size of existing image collections with zero quality loss will make JPEG XL a success no matter what else happens.
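For anyone who wants to try the round trip, here is a rough Python sketch using the cjxl and djxl command-line tools that ship with libjxl. It assumes both tools are installed and on PATH, and the file names are just placeholders:

    # Sketch only: lossless JPEG -> JPEG XL -> JPEG round trip using the
    # libjxl command-line tools. Assumes cjxl/djxl are installed.
    import subprocess

    # cjxl recompresses JPEG input losslessly by default, keeping the data
    # needed to reconstruct the original JPEG bitstream.
    subprocess.run(["cjxl", "photo.jpg", "photo.jxl"], check=True)

    # djxl writes a reconstructed JPEG when given a .jpg output name.
    subprocess.run(["djxl", "photo.jxl", "photo_restored.jpg"], check=True)

The reconstructed file should match the original byte for byte, which is what makes this safe for existing collections.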




To add to this, it's already possible to give this a try by building Brunsli from source:

https://github.com/google/brunsli

I personally use the cbrunsli/dbrunsli command-line programs to archive old high-resolution JPEG photos that I've taken over the years. Having a gander at one subdirectory with 94 photos totalling 354 MB, running cbrunsli on them brings the size down to 282 MB, a saving of about 20%. And if I ever wanted to convert them back to JPEG, each file would be bit-identical to the original.

Perhaps it's a little early to trust my data to JPEG XL/Brunsli, but I've run tests comparing hundreds of MD5 checksums of JPEG files losslessly recreated by Brunsli, and have not yet run into a single mismatch.
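Not my exact script, but roughly what such a check looks like in Python, assuming the cbrunsli/dbrunsli binaries are on PATH and take "input output" arguments (adjust to your build if yours differ; file names are placeholders):

    # Sketch of a lossless round-trip check: compress with cbrunsli,
    # decompress with dbrunsli, then compare MD5 checksums.
    import hashlib
    import subprocess

    def md5sum(path):
        # Hash the file in chunks so large JPEGs need not fit in memory.
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    subprocess.run(["cbrunsli", "photo.jpg", "photo.brn"], check=True)
    subprocess.run(["dbrunsli", "photo.brn", "photo_roundtrip.jpg"], check=True)

    # A lossless transcode must give back the exact original bytes.
    assert md5sum("photo.jpg") == md5sum("photo_roundtrip.jpg")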

I can only say that I am very excited for the day that JPEG XL truly hits primetime.


Brunsli works very well, but it is not compatible with the final JPEG XL format. To reduce the binary size of libjxl, we decided on a more holistic approach where the VarDCT machinery is used to encode JPEGs losslessly. This saved about 20% of the binary size and reduced the attack surface. The decoder binary size is now about 300 kB on Arm.



