I just want a picture of a got-dang hot dog.
Everyone should just be using AV1 at this point. https://en.wikipedia.org/wiki/AV1
I assume you mean AVIF? Because AV1 is not an image (file) format but a video compression format (that needs to be wrapped in container file formats to be storable).
“AVIF is an image file format that uses AV1 compression algorithms.” yes i mean that
AVIF is worse than JPEG XL, hence why every company and organization has been implementing it.
Given these positive signals, we would welcome contributions
Poor Google doesn’t have the manpower to implement it. They can only accept contributions from volunteers.
Don’t worry, they can spare some 20 percent time
Google is just a small indie company after all.
“we would welcome contributions to integrate a performant and memory-safe JPEG XL decoder in Chromium. In order to enable it by default in Chromium we would need a commitment to long-term maintenance.”
yeah
I.e. the existing implementation is not usable because it’s not written in rust
Or would they demand it in Go? Or have they abandoned that?
That might work, but I didn’t think Go was
- as safe as Rust
- built for CPU-intensive operations (aside from potentially concurrent tasks)
Given these positive signals
Those idiots waited for 4 years because they followed the hype of the moment. I’m glad I removed Google from my life.
something tells me they wanted their own formats to catch on.
Absolutely, google does that shit constantly, well known within the internet standards community
That used to be what Microsoft (Internet Explorer) was famous for. I guess Chrome has lived long enough to be the villain, but Firefox is still the hero to me.
Sorry to break it to you, but you might want to start looking at Firefox forks.
This must be your first time seeing what Google support looks like
This is pretty standard unless you can get an exec’s personal attention.
The name of the format makes me think it’s regular jpeg, but bigger. Wouldn’t it be better to be smaller? 🤔
It’s even more confusing than that; the X is for revision 10, and the L is for long term. It’s an update to the JPEG standard intended to cover expected future uses and capabilities.
“10 LTS”?
No. They increased the max “canvas” size and increased encoding efficiency. You’d want the file size to be smaller but the image itself to be larger (and consequently more detailed)
Finally.
I would be more excited about JPEG XL if it was backward compatible. Not looking forward to yet another image standard that requires OS and hardware upgrades simply so servers can save a few bytes.
How would a new format be backwards-compatible? At least JPEG-XL can losslessly compress standard jpg for a bit of space savings, and servers can choose to deliver the decompressed jpg to clients that don’t support JPEG-XL.
Also from Wikipedia:
Computationally efficient encoding and decoding without requiring specialized hardware: JPEG XL is about as fast to encode and decode as old JPEG using libjpeg-turbo
Being a JPEG superset, JXL provides efficient lossless recompression options for images in the traditional/legacy JPEG format that can represent JPEG data in a more space-efficient way (~20% size reduction due to the better entropy coder) and can easily be reversed, e.g. on the fly. Wrapped inside a JPEG XL file/stream, it can be combined with additional elements, e.g. an alpha channel.
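As a concrete illustration of that round trip, the libjxl reference tools can do it from the command line (a sketch, assuming `cjxl`/`djxl` are installed; behavior described is the default in recent libjxl versions, and the file names are made up):

```shell
# Losslessly transcode an existing JPEG into JPEG XL.
# When the input is a .jpg, cjxl defaults to lossless JPEG recompression,
# so the original JPEG bitstream is preserved, just re-entropy-coded.
cjxl photo.jpg photo.jxl

# Reverse the transcode: reconstruct the original JPEG file.
djxl photo.jxl restored.jpg

# Compare sizes; the .jxl is typically ~20% smaller than the .jpg.
ls -l photo.jpg photo.jxl restored.jpg
```

This is the “deliver the decompressed jpg to clients that don’t support JPEG-XL” path: a server can store the smaller `.jxl` and regenerate the original JPEG on the fly for old clients.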
All you have to do is add a small traditional JPEG image at the start of the file. It doesn’t have to be high resolution or more than a couple of kb. The new format decoder would know this, and skip the traditional jpeg “header”, rendering the newer file format embedded in the image.
Would completely defeat the purpose of making a new smaller file format if we prefix it with the old format.
That would have been a brilliant move with wav vs MP3
If you’re really saving 20% in file size with XL, adding back a very compressed preview image that takes up one or two percent isn’t going to cost you much.
What does backward compatibility in an image format even mean? Being able to open it in the Windows image viewer?
It requires neither of those upgrades though? Unless you’re still using Windows XP, I guess, for some reason. It’s just an update to the image decoder.
I just use old JPEGs. Not JPEG2000, not PNG, not WebP, not JPEG XL.
Feel free to use floppy disks. Btw if you are online, you use WebP and PNG all the time 🤣
Not if they use wget to only download the HTML!
Sir, don’t you dare encroach on those Lynx and W3M users. They don’t need no stinking images!
Lynx is the best browser.
I prefer offpunk.
If you are using Firefox:
- Enter the following in the address bar: about:config
- Search for: image.webp.enabled
- Set it to false

Websites are delivering JPG/PNG instead of WebP again.
Maybe this should come with a warning. The purpose of WebP is to quickly serve images to the user without grabbing the entire image data. Without WebP all images will be fully loaded, in the right conditions a page could load real slow.
I love webp, but your explanation is a bit confused. Webp is typically lossy, just as jpeg — only, it’s more efficiently compressed, meaning smaller size for the same image quality. So there’s no ‘entire image data’, there are only different approximations of the original image and different compressed files. Full-blown lossless images in PNG or other formats take several times more data.
Disabling webp in favor of jpeg would use like 20-40% more data, in comparison. Which still sucks, but not as much.
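The “same quality, smaller file” claim is easy to sanity-check with Pillow (a rough sketch only; the gradient image, format choices, and quality setting are illustrative, and real savings depend heavily on the image content):

```python
from io import BytesIO
from PIL import Image

# Build a simple synthetic "photo": a smooth RGB gradient.
img = Image.new("RGB", (256, 256))
img.putdata([(x, y, (x + y) // 2) for y in range(256) for x in range(256)])

def encoded_size(fmt: str, quality: int = 80) -> int:
    """Encode the image in-memory and return the byte count."""
    buf = BytesIO()
    img.save(buf, fmt, quality=quality)
    return buf.tell()

jpeg_bytes = encoded_size("JPEG")
webp_bytes = encoded_size("WEBP")
print(f"JPEG: {jpeg_bytes} bytes, WebP: {webp_bytes} bytes")
```

Running something like this on a corpus of real photos is how the usual 20–40% figures are estimated.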
I wasn’t going to get into the whole lossiness of the formats, so I just simplified to “full image” versus “compressed format”. It is interesting that it only saves 20–40%. I was under the impression that the page only rendered the image at the size necessary to fit the layout, not the full-resolution image. Forcing it to less lossy or lossless would mean the larger image would always be available to be rendered without another web request.
That’s a rather interesting consideration as to whether rendering at smaller sizes skips decoding parts of the image.
First, the presented file is normally always loaded in full, because that’s how file transfer works over the web. Until lately, there were no different sizes available, and that only became widely-ish spread because of Apple’s ‘Retina’ displays with different dots-per-inch resolution, mostly hidpi being two times the linear size of the standard dpi. Some sites, like Wikipedia, also support resizing images on the fly to some target dimensions, which results in a new image of the JPEG or other format. In any case, to my somewhat experienced knowledge, JPEG itself doesn’t support sending every second row or anything like that, so you always get a file of a predetermined size.
First-and-a-half, various web apps can implement their own methods for loading lower- or higher-res images, which they prepare in advance. E.g. a local analogue to Facebook almost certainly loads various prepared-in-advance low-res images for viewing in the apps or on the site, but has the full-res images available on request, via a menu.
Second, I would imagine that JPEG decoding always results in the image of the original size, which is then dynamically resized to the viewport of the target display — particularly since many apps allow zooming in or out of the image on the fly. Specifically, I think decoding the JPEG image creates a native lossless image similar to BMP or somesuch (essentially just a 2d array of pixel colors), which is then fed to the OS’s rendering capabilities, taking quite a chunk of memory. Of course, by now this is all accelerated by the hardware a whole lot, with the common algorithms being prepared to render raw pixels, JPEG, and a whole bunch of other formats.
It would be quite interesting if file decoding itself could just skip some part of the rows or columns, but I don’t think that’s quite how the compression works in current formats (at least in lossy ones, which depend on the previous data to encode later data). Although afaik JPEG encodes the image in 8x8 blocks (16x16 macroblocks with chroma subsampling), so it could be that whole chunks could be skipped altogether.
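Baseline JPEG decoders can actually do something close to this: because the image is stored as DCT blocks, libjpeg can decode at 1/2, 1/4, or 1/8 scale by using only the low-frequency coefficients of each block, skipping most of the inverse-DCT work. Pillow exposes this via `draft()` (a sketch; the in-memory test image stands in for a real photo):

```python
from io import BytesIO
from PIL import Image

# Make a small JPEG in memory to decode.
buf = BytesIO()
Image.new("RGB", (64, 64), (200, 120, 40)).save(buf, "JPEG")
buf.seek(0)

img = Image.open(buf)
# Ask the JPEG decoder for roughly 16x16 output; it picks the nearest
# supported DCT scale factor (here 1/4) before decoding.
img.draft("RGB", (16, 16))
img.load()
print(img.size)  # the image is decoded at the reduced scale, not resized afterwards
```

So thumbnailing can avoid a full-resolution decode, even though the whole file still has to be transferred.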
Your name is amazing
No, I have WebP blocked in my about:config. And I use Pale Moon, which actually blocks the things unlike modern FF. And I don’t load PNG either.
Do you also hit yourself in the nuts every morning to show the world how tough you are?
lmao
I find this interesting. Is there an advantage to the older formats, or is this just for compatibility with custom photo editing tools?
Compatibility is an advantage.
AVIF started heavily creeping in, too.
I’ve yet to see any AVIF in the wild. I think support for it is not quite there yet, everybody is still relying on WEBP.
I’ve seen a lot of avifs masquerading as jpegs lol (I know because KDE Dolphin for some reason isn’t showing a preview for those until I rename them)
Why though
Because I’m tired of all this nonsense where just because a thing is a mature technology, it’s considered obsolete. Stop constantly pushing for the next thing. Keep the things that work.
“How dare they invent a more efficient image encoding! Back in my day we had bmp and we liked it!” - grandpa simpson
So I tied an onion to my belt, which was the style at the time
I mean, BMP does still work as an uncompressed, artifact-free format.
you can have uncompressed png too
Sure. But you use bmp when you want to nuke your drive space for no real reason.
Wdym? This guy is still using punch cards
Webp is a smaller file size than jpeg for the same image quality in almost all circumstances - so it’s more efficient and quicker to load. It also supports lossless compression, transparency, and animation, none of which jpeg does. And jpeg shows noticeable visual artefacts at a much higher quality setting than webp does.
People didn’t adopt it to annoy you. It’s started to replace jpeg for the same reason jpeg started to replace bmp - it’s a better, more efficient format.
Webp is a smaller file size than jpeg for the same image quality in almost all circumstances
For lower quality images, sure; for high quality ones JPEG will beat it (WebP, being based on an old video format, only supports a quarter of the colour resolution that JPEG can, etc.). JPEG is actually so good that it still comes out ahead in a bunch of benchmarks; it’s just now starting to show its age technology-wise (like WebP, it’s limited to 8bpc in most cases).
It also doesn’t hurt that Google ranks sites using WebP/AVIF higher than ones that don’t (via Lighthouse).
Edit: I should clarify, this is the lossy mode. The lossless mode gives better compression than PNG, but is still limited to 8bpc, so can’t store high bit depth, or HDR images, like PNG can.
Edit 2: s/bpp/bpc/
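The chroma point above can be made concrete: lossy WebP always stores colour at quarter resolution (4:2:0), while JPEG encoders let you keep full-resolution chroma. A Pillow sketch (the test image and quality value are made up for illustration; sharp colour edges are where subsampling matters most):

```python
from io import BytesIO
from PIL import Image

# Alternating red/blue columns: maximum-frequency chroma content.
img = Image.new("RGB", (128, 128), (255, 0, 0))
for x in range(0, 128, 2):
    for y in range(128):
        img.putpixel((x, y), (0, 0, 255))

def jpeg_size(subsampling: int) -> int:
    # subsampling=0 is 4:4:4 (full-res chroma), 2 is 4:2:0 (quarter-res).
    buf = BytesIO()
    img.save(buf, "JPEG", quality=80, subsampling=subsampling)
    return buf.tell()

print("4:4:4:", jpeg_size(0), "bytes")
print("4:2:0:", jpeg_size(2), "bytes")
```

The 4:4:4 file is larger because it actually keeps the colour detail; lossy WebP simply has no equivalent of the first option.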
It is controlled by google tho
It’s unreasonable to stop further software development just because there’s a ‘mature’ solution around. Besides, just because a solution is ‘mature’ doesn’t make it good.
And considering that it seems like you can still use the original, about 30 year old format, doesn’t look like there’s any harm for the folks not needing or able to use the new stuff.
you know, using a better encoding is better for your dial-up internet too
these damn kids will wake up one day and go, “why do you need xpg? jpgxl is just fine!”
they don’t realize yet that the only reason jpeg xl exists is to silently slip that corpo collar around their necks.
🤷 only time can feed wisdom and cure stupid.
Oh yeah? Well I named my firstborn child JPEG!