DO11: Horrible JPEG viewing quality (blocking)

I'd want to know more about what libjpeg-turbo is doing before assuming the output is better.

Blurring the JPEG data might make things look better on images that are over-compressed to the point of showing extreme blocking, but it might make high-quality images look worse by blurring their details.

Maybe it's more intelligent than that, but so far we only know it makes one image look better.

I'm also still wondering why you see smoothing in apps that I've checked and seen blocking in. I am not sure about your assumption that everything is using libjpeg-turbo based on that, since I see different results in some of the same apps. (I suspect different color management settings are confusing the issue, since the colors are extremely different depending on the app used, and sometimes that washes out details in the red dress which also has the effect of making the blocking harder to see.)

Another thing to note is that you can just save your images with higher-quality compression if you don't want them to have huge JPEG artifacts. The artifacts are in the images; it's just a matter of how much the image data is blurred to hide them. (Unless there's something even stranger going on, like a JPEG encoder that's producing data that libjpeg-turbo decodes properly but the reference decoder does not, maybe by accident because the author only tested highly compressed images with libjpeg-turbo.)

At the moment there are too many unknowns for us to switch from libjpeg to libjpeg-turbo, but it's a possibility if we can get answers to the questions and prove to ourselves that libjpeg-turbo is properly better (and not just blurring what it outputs).

Thanks for the apology. Maybe try harder to not abuse us next time we point out doubts in the conclusions you've jumped to. :slight_smile:

Well, that was a royal waste of my time to prove a point, but here it is.

All of the applications mentioned were screenshotted as PNG (to preserve quality), cropped, zoomed x2 (linear/pixel resize), saved as PNG, zipped, and also spread out onto one big image in Paint Shop Pro and saved as PNG.

If you think there are significant (if any) differences between all of the apps except Photoshop and Dopus, then i suggest you unzip that .zip, view them in Dopus, and press page-up/down through them all. Get in real close to your monitor and look at them.

Everything is smoothed over pretty nicely except Photoshop and Dopus, and those don't even look the same. Dopus is all by itself with its own unique blocky output.

Yes, they're all somewhat blocky (it's a bad source JPEG), but there's no denying the extremely similar (if not exact) smoothing happening in every other application. They're all basically identical.

Also, as you can see, embedded colour profiles make no difference.

Whew...



all jpeg decompression examples.zip (220 KB)

(FWIW, I checked Photoshop CS4 and it produces identical output to your CS6 image.)

I think Opus smooths things more than Photoshop, placing the standard libjpeg with default settings somewhere between whatever Photoshop is using and libjpeg-turbo, for this one particular image:


I'm not sure what any of this tells us, though, except that with one horrendously over-compressed image that is full of blocking artifacts no matter which decoder is used, you get different amounts of smoothing in different decoders (but the image is still full of artifacts either way, and the smoothing actually removes detail as well, which may degrade image quality in other images).

That still isn't a solid case for changing the decoder in Opus.

Yeah, you know, you're right. It is a horrendously over-compressed image.

I wouldn't have even noticed it if it weren't for downloading some Facebook albums and then thinking that the downloader i was using was somehow getting the low-quality versions because Dopus was showing horrible pictures compared to what i was seeing in Chrome.

Oh, wait, did i just admit i was ripping soft Facebook "porn" (lol censored again)? :wink:

Anyway, i can assure you, it's not an isolated incident. There are entire albums, nay, entire profiles, full of images with quality like this. I don't have to do that screenshotting and cropping for all of them, do i? :wink:

It's not really a strong case for changing the decompressor, but it does show that there are three groups of image output: Dopus, Photoshop, and pretty much everyone else. Are you sure you guys want to be the odd one out? It's up to you, i guess. There's not much more i can do to change your mind.

I will say though that from what i've (and we've) seen, libjpeg-turbo has a high adoption rate. It's also a very good, fast library. As you more or less said yourself, it's likely that libjpeg-turbo is smart enough not to perform smoothing on high-quality images. I remember reading somewhere, for example, that images with quality >=96 are forced to use a certain IDCT method.
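For reference, the knobs in question are part of the standard libjpeg decompression API, which libjpeg-turbo implements too. A minimal sketch of where they live (field names are from jpeglib.h; whether turbo really flips them based on an estimated quality is hearsay from this thread, not something i've confirmed in its source):

```c
/* Sketch only: decode-quality settings exposed by the libjpeg /
 * libjpeg-turbo decompression API. Set these after jpeg_read_header()
 * and before jpeg_start_decompress(). */
struct jpeg_decompress_struct cinfo;

/* ... jpeg_create_decompress(&cinfo), set data source,
 *     jpeg_read_header(&cinfo, TRUE) ... */

cinfo.dct_method          = JDCT_ISLOW; /* slower, more accurate IDCT */
cinfo.do_fancy_upsampling = TRUE;       /* smoother chroma upsampling */
cinfo.do_block_smoothing  = TRUE;       /* progressive JPEGs only     */

/* ... jpeg_start_decompress(&cinfo), read scanlines as usual ... */
```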

Food for thought.

You could always write a viewer plugin for Opus that uses libjpeg-turbo.

That's an excellent idea and i actually considered it already! But i couldn't find any of the API docs, although i didn't search for long. Could you point me in the right direction?

lol nevermind there's a section in the forums for it. Sorry. :blush:

Well, it's been a busy day but i managed to squeeze a good chunk of time in over the afternoon and this plugin is more or less done.

It works i guess.

Obviously it's displaying the Facebook pic example smoothed over like all the other apps. So yeah, your low-quality Facebook girls can now go from being sharply blocky, to smooth.. And still blocky. :open_mouth:

Since it's libjpeg-turbo, i thought i'd run some speed tests. It's a bit difficult; i'm not really sure of the best way to do it. At first i thought there was no difference in speed, but it didn't take me long to remember that the greatest amount of time spent displaying an image is in its resampling algorithm, provided of course that it's bigger than your screen and needs resampling down.

So, after setting Dopus to not change the zoom level (ie: display images at 100% original size with no resizing), i ran some tests.

The test involves loading up the first picture in a folder of huge JPEGs (30-40MB each), and then holding page-down until it displays a certain picture (the 8th, in fact) and then releasing the key.

I made a graph of the CPU usage. The first part of each section is the initial loading, and the second part is the page-down-ing. Each pixel is 0.5 seconds. First i ran the test with the libjpeg-turbo plugin enabled, and it did 27 pixels (13.5 seconds) of CPU usage. Then i disabled the plugin and re-ran the test. It did 38 pixels (19 seconds).

Then i ran the test again the other way around: 39 pixels (19.5 seconds) for Dopus's built-in decoder, and then 29 pixels (14.5 seconds) for the turbo plugin.

Unless i have really stupid random 5-6 second delayed reflexes, i'd say the test shows that libjpeg-turbo is quite a bit faster. Dopus's built-in JPEG decoder takes ~37.5% extra time to decompress the same JPEGs. Going from huge JPEG to huge JPEG does feel faster and more fluid using the libjpeg-turbo plugin.

So, given that your blocky Facebook girls will be smoother, and your JPEGs will decompress in 72% of the time Dopus takes (or Dopus will take 37.5% longer to decompress them, depending on which way you want to look at it)... Does anyone want this plugin?

It seems kind of a waste to keep it all for myself. On the other hand i really have no idea how to release plugins, and i don't know if there's any demand for it. Could Leo or Jon point me in the right direction to releasing a plugin? It goes in the plugins sub-forum i guess. Does there need to be source code for it? My code is embarrassing, but if i must release it then i guess i will and you can all laugh at me. :laughing:

But it all depends on if anyone actually wants the plugin anyway. I'd have to tidy it up a bit first too. If you use it and you don't feel it speeds you up, then feel free of course to not use it.

Anyway.. Interesting little adventure. Rock on chums. :thumbsup:

I want jpgs to be displayed as fast as possible of course..o) We had a thread about preloading such images, but this won't be done in dopus anytime soon, so whatever speeds up image browsing is welcome.. o) I don't know how it will blend into the fullscreen/viewer pane we have right now to view pictures; i mean, there are some very handy options available, like cropping and save-as, which I use regularly. If these are not available, the plugin is of less use to me than the current viewer, even if it is noticeably faster.

I'm sure people will be interested in trying it.

To release a plugin you just need to put it (or a link to it, up to you) in the plugins area for download.

Source code is nice but not required. 32-bit and 64-bit builds are good, if you're able to make & test both. Apart from that, all people really need is the DLL(s) and a brief description of what the plugin does.

Is it released?.. o)

Hey guys, sorry about the delay, i've been so busy.

There's also a slight hitch. Unless i want to identify JPEG files myself with a custom implementation, which would be a pain and i'm not sure would even work, i have to use jpeg_read_header(), and this function only returns valid data once it can read the whole header, which includes the thumbnail if one is embedded. That makes it impossible to identify some JPEGs without reading something like 64KB of data (maybe slightly less, or a lot more, but that's roughly the most i had to read to get 100% of my own JPEGs to show), short of a lot of custom work.

I'm unsure about the implications of reading a lot of data from every single IStream that Dopus gives me to identify. I can imagine that identifying a JPEG file with a .jpg/.jpeg extension will require reading the whole embedded thumbnail (or the image itself) eventually anyway, to generate a thumbnail or just to view it. My plugin only seems to get called to identify files when you're pretty much actually going to see the image on your screen, even if just as a thumbnail.

The only case i'd think is a problem is a "details" listing of a folder where you just want the image dimensions to show in a column. It's going to read a potentially big chunk of data (ie: the embedded thumbnail) from every file just to show you the dimensions. That might be a bit slow.

Anyone got any opinions on that? This is getting into the realm of plugin development, and not support, so i really didn't know if i should talk about this on this subforum. Anyway if there's no objections to reading embedded thumbnails on every .jpg/.jpeg file identification even if you're not showing the thumbnail, then i'll just go ahead with the plugin release.

Thanks for the demand (tbone) and direction (Leo) and idea (Jon) though guys. It's really a lot of fun to actually be contributing something to a community. It's my first time! I just might take a day or two more to get it perfect because i want to get it right the first go. :smiley: