JXL is an image codec - it can afford to be less efficient (it isn't! Rather the other way around) and to not be hardware accelerated, because its typical use case is not presenting 30-60 images per second like a video codec, so it will not affect the battery life of a device in any meaningful way. Also, AV1 hardware decoding is far, far from ubiquitous, so many users would not benefit from it at all.
But - back to JXL vs WebP:
I think Google had genuinely good intentions with WebP, but the effort was somewhat ruined by their culture: they relied too heavily on metrics, which aren't always a good proxy for image quality, because humans looking at pictures don't scale, and Google does things that scale. We now have a codec with good metrics that nevertheless looks poor.
It's based on the intraframe coding of the VP8 format - a video codec - and I think it suffers from that. It looks OK in a video, but bad in stills, where you have more time to notice its warts.
Most importantly, it's almost always produced by recompressing a JPEG, causing a generation loss. I don't know of any phone or camera which produces native WebP (maybe some recent Pixels? Dunno), and any professional device used in RAW mode usually implies the participation of someone who cares about the finished product and will not want WebP (and will resent when it's used without their consent by webmasters wishing to tick a box in PageSpeed, as the author mentions). JXL has a lossless recompression mode in which it just replaces the Huffman compression stage of an existing JPEG with something more modern, and this results in a pixel-accurate image which is 20% smaller than the original file - this already eats WebP's claimed space savings, and then some, with no generation loss. Based on this fact alone, there shouldn't even be a discussion.
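To make the recompression point concrete, here's a minimal sketch using libjxl's cjxl command-line tool, which transcodes JPEG input losslessly by default (the `--lossless_jpeg=1` flag makes that explicit); the file names and the ~20% figure are illustrative, not guaranteed for every image:

```python
# Sketch: losslessly transcode a JPEG to JXL and report the size saving.
# Assumes libjxl's cjxl binary is installed and on PATH.
import os
import subprocess

def recompress_jpeg(jpeg_path: str, jxl_path: str) -> float:
    """Transcode jpeg_path to jxl_path losslessly, return fractional size saving."""
    # --lossless_jpeg=1 keeps the original DCT coefficient data so the exact
    # JPEG bytestream can be reconstructed later (no generation loss).
    subprocess.run(["cjxl", jpeg_path, jxl_path, "--lossless_jpeg=1"], check=True)
    before = os.path.getsize(jpeg_path)
    after = os.path.getsize(jxl_path)
    return 1 - after / before

if __name__ == "__main__":
    saving = recompress_jpeg("photo.jpg", "photo.jxl")  # hypothetical input file
    print(f"Saved {saving:.1%} with no generation loss")  # often around 20%
```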
....but let's have a discussion anyway. A JPEG -> JXL lossless recompression isn't conceptually new - Stuffit (remember them?) did it in 2005, sadly without enough traction (unsurprisingly, since there were patents and licensing costs). Basically it's _still_ a JPEG - if you decompress the final stage of a JPEG, and the final stage of a JXL (or a .SIF), you get the exact same bytestream. While yet another amazing testament to JPEG's longevity and relevance, it is also concerning: how could Google do worse than that??? When basically rezipping (with a modern algo) the existing DCT macroblock bytestream of a 30-year-old codec beats your new codec, you should just trash it.
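That "it's still a JPEG" claim can be checked directly: if the JXL was produced with lossless JPEG transcoding, libjxl's djxl tool can reconstruct the original JPEG file, and a plain byte comparison should succeed. A hedged sketch, assuming djxl is on PATH and reusing the hypothetical file names from above:

```python
# Sketch: reconstruct the original JPEG from the JXL and verify it byte-for-byte.
import filecmp
import subprocess

# djxl writes a .jpg output by reusing the JPEG reconstruction data stored in the JXL.
subprocess.run(["djxl", "photo.jxl", "photo_restored.jpg"], check=True)

# Not merely visually identical: the reconstructed file is the same bytestream.
assert filecmp.cmp("photo.jpg", "photo_restored.jpg", shallow=False)
print("round-trip is bit-exact")
```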
Edit:
...but I forgot to answer your question. Why is JXL viewed so favorably on HN? Because it doesn't suck, and we're sad that Google decided to be a roadblock and push their own thing instead, which sucks. At least AVIF is way better than WebP, even though it's a monster, computationally.
What you're ignoring is that WebP is from the year 2010. JPEG XL is from 2022. Incidentally, JPEG XL is also a Google project, making your ranting about how bad they are at image formats pretty funny.
Hi!
I'm aware that JXL partially originates from Google's PIK (and also Brunsli?), but I had indeed forgotten that WebP started in 2010 - wow, 13 years old already.
I'll therefore correct my statement: "How could Google do worse than that??? When basically rezipping (with a modern algo) the existing DCT macroblock bytestream of an 18-year-old codec beats your new codec, you should just trash it."
Also, Stuffit's SIF format is still 5 years prior to 2010, so that point stands.
I didn't compare with Stuffit. If it's better than JXL recompression, perhaps they put more focus on lossless recompression. Perhaps they had fewer real-time constraints on decoding speed.