WebP has been around since 2010 and has Google's backing. Microsoft tried their own thing (JPEG XR, 2009), the actual JPEG committee announced their work on JPEG XL (with Google's backing) a while ago, and Apple switched to HEIC/HEIF as of 2017 (HEIC is like WebP, just based on HEVC aka H.265 instead of VP8 i-frames; and it's a patent mess, of course).
There is simply no reason why people in 2016 or now would be interested in a format from 2000 that was a patent minefield until at least 2016.
Except for JPEG, the alternatives are also patent minefields...
Are people just more cavalier about the patent risk these days? The problem with JPEG2000 wasn't the patents we knew about; it was the possibility of submarine patents. People were still wary after the GIF debacle. Nobody wanted to be charged $0.05/image after the fact when they'd already delivered literally billions of images. Plus the courts were seen as very favorable towards patent holders, even when the holders were acting in bad faith.
That's kind of scary for something developed in the 90s. It originally ran on Pentiums, K6s, PPC 604s, and the like, and it's still too expensive for a Ryzen 7?
Yep. There's a ton of data dependencies, where e.g. you can't begin decoding the next bit until you've finished decoding the current one. It's all about progressive decoding, so there are multiple passes over each group of wavelet coefficients, accessed in a data-dependent sequence. Each wavelet transform involves accessing and interleaving the same 2D arrays in both row- and column-oriented fashion.
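To make that last point concrete, here's a rough sketch (simplified, not the actual JPEG 2000 lifting code) of a separable 2D wavelet pass: the same buffer gets walked once along rows at unit stride and once along columns at a stride of the full image width, which is exactly the access pattern caches and prefetchers hate.

```c
#include <stddef.h>

/* One simplified 5/3-style lifting pass along a row or column.
   `stride` is 1 for a row and `width` for a column of the same buffer. */
static void lift_1d(int *x, size_t n, size_t stride)
{
    /* predict: odd samples become (rough) high-pass coefficients */
    for (size_t i = 1; i + 1 < n; i += 2)
        x[i * stride] -= (x[(i - 1) * stride] + x[(i + 1) * stride]) / 2;
    /* update: even samples become (rough) low-pass coefficients */
    for (size_t i = 2; i + 1 < n; i += 2)
        x[i * stride] += (x[(i - 1) * stride] + x[(i + 1) * stride] + 2) / 4;
}

/* A separable 2D pass: rows first (sequential, prefetch-friendly),
   then columns over the very same data (stride = width, cache-hostile). */
void dwt_2d_pass(int *img, size_t width, size_t height)
{
    for (size_t r = 0; r < height; r++)
        lift_1d(img + r * width, width, 1);
    for (size_t c = 0; c < width; c++)
        lift_1d(img + c, height, width);
}
```

And the real transform repeats this on the low-pass quadrant for every resolution level, so you pay for those strided column walks several times over.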
These design decisions all made sense when clock rates were exponentiating, but they're all nightmares now that we rely on branch prediction and memory prefetching and superscalar execution units. The codec is simply not a good fit for the computing architectures we have today.
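The "can't start the next bit until the current bit is done" part has roughly this shape. Below is a generic adaptive binary arithmetic decoder, not JPEG 2000's actual MQ coder, but the dependency structure is the same: each decoded bit updates the interval state and the probability estimate that the very next bit needs, and the branch it takes is data-dependent and hard to predict, so an out-of-order core can't overlap iterations.

```c
#include <stdint.h>
#include <stddef.h>

typedef struct {
    const uint8_t *buf;  /* compressed bytes (bounds checks omitted)  */
    size_t pos;
    uint32_t value;      /* stream bits currently inside the interval */
    uint32_t range;      /* width of the current coding interval      */
    uint32_t p_zero;     /* adaptive estimate of P(bit == 0), Q16     */
} bindec;

void bindec_init(bindec *d, const uint8_t *buf)
{
    d->buf = buf;
    d->pos = 4;
    d->value = ((uint32_t)buf[0] << 24) | ((uint32_t)buf[1] << 16)
             | ((uint32_t)buf[2] << 8)  |  (uint32_t)buf[3];
    d->range = 0xFFFFFFFFu;
    d->p_zero = 0x8000;  /* start at 50/50 */
}

int decode_bit(bindec *d)
{
    /* split the interval according to the current probability estimate */
    uint32_t split = (uint32_t)(((uint64_t)d->range * d->p_zero) >> 16);
    int bit;

    if (d->value < split) {                       /* the bit was a 0 */
        d->range = split;
        d->p_zero += (0x10000 - d->p_zero) >> 5;  /* adapt toward 0  */
        bit = 0;
    } else {                                      /* the bit was a 1 */
        d->value -= split;
        d->range -= split;
        d->p_zero -= d->p_zero >> 5;              /* adapt toward 1  */
        bit = 1;
    }

    while (d->range < (1u << 24)) {               /* renormalize     */
        d->range <<= 8;
        d->value = (d->value << 8) | d->buf[d->pos++];
    }

    /* range, value and p_zero all feed the *next* call: a loop-carried
       dependency, so bit N+1 cannot even start until bit N is finished. */
    return bit;
}
```

JPEG 2000 runs a coder of this kind over every code-block, with several coding passes per bit-plane, which is roughly where the "multiple passes in a data-dependent sequence" above comes from.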
> These design decisions all made sense when clock rates were exponentiating, but they're all nightmares now that we rely on branch prediction and memory prefetching and superscalar execution units. The codec is simply not a good fit for the computing architectures we have today.
Arguably not a good choice for the year 2000 either, considering that all high-performance CPUs at that time were out-of-order, superscalar, and deeply pipelined.
Images got bigger... it's not that it's too expensive, it's just way more expensive than good-enough alternatives. There are other reasons too, but that's a surprising one.
Edit: Did some more research, and the patent risk appears to have passed as of 2016. Still, nobody seems to have any interest in JPEG2000.