
Do the libjpeg and the .Net jpeg codec really differ significantly on monochrome data?

Posted on https://www.devze.com, 2023-03-05 00:29 (source: web)

I work with a lot of monochrome image data and this morning I noticed that there appears to be a significant difference between the way libjpeg and the .Net jpeg codec handle monochrome data. It appears that a monochrome image saved at ANY quality setting using libjpeg and opened using the default .Net jpeg codec actually only loads with 16 different shades of gray and all intermediate shades are rendered as stippled.

Here is the histogram of a smooth gradient saved by libjpeg and loaded by .net

[histogram image]

The histogram should have been perfectly level.

And here is a (zoomed-in) sample of what that gradient looks like (it should be a perfectly smooth transition):

[zoomed-in gradient sample image]

This should be a smooth transition from gray 85 on the left to gray 136 on the right but only 4 shades of gray are actually rendered to make that transition.

My question is: am I crazy, and if not, just how far does this codec discrepancy go? Is there a good workaround if you are using both libraries in different programs?

I am not blaming either codec, just pointing out what appears to be a discrepancy. I noticed this with images I knew were created using libjpeg, assumed it was a quality-setting issue, tried using FastStone Image Resizer to create test images and got the same result, then tried IrfanView and got the same result again. As both of those programs must use some JPEG library, I tend to assume they are also using libjpeg and that there is a genuine codec conflict.

On the loading side I have run into the same result loading images with both my own .Net code and using Paint.net.

Finally, here is a sample at normal resolution so you can download it and try it yourself. Loading it in some programs (e.g. your browser) will give you a nice gradient, but loading it with your own .Net code or Paint.Net will give you a dithered gradient like the one above, rendered using only 16 shades of gray.

[sample image at normal resolution]

Does anyone know more about this, how far it goes and what good workarounds might be?


I am able to reproduce your symptoms by opening your sample image in the version of Paint bundled with Windows 7. Analyzing the file shows it to be a valid JPEG, and having it show up properly in the browser confirms this. It looks like Microsoft really messed up bad. It has already been submitted as a bug:

https://connect.microsoft.com/VisualStudio/feedback/details/597657/grayscale-jpeg-image-read-as-format8bppindexed-but-quantized-as-format4bppindexed#details

http://social.msdn.microsoft.com/Forums/en-US/wpf/thread/9132e2bd-23cc-4e5a-a783-1fa4abe11624/

The workaround would be to create your JPEG as a full-color image, but put only grayscale pixel values into it.
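As a sketch of that workaround (not the asker's actual code — I'm using Python with Pillow here as a stand-in for whatever libjpeg-based encoder you control, and the file names are made up for illustration), the idea is to promote the single-channel grayscale data to RGB before encoding, so the file is written as a three-channel JPEG that the .Net decoder handles without quantizing to 16 levels:

```python
from PIL import Image

# Build a horizontal gradient from gray 85 to gray 136, the range
# described in the question.
width, height = 256, 64
grad = Image.new("L", (width, height))
grad.putdata([85 + (x * 51) // (width - 1)
              for y in range(height) for x in range(width)])

# Workaround: promote to RGB before saving, so the encoder writes a
# three-channel JPEG whose channels all carry the same gray values.
grad.convert("RGB").save("gradient_rgb.jpg", quality=95)

# For comparison, the problematic single-channel (grayscale JFIF) form:
grad.save("gradient_gray.jpg", quality=95)
```

The RGB file is slightly larger than the true grayscale one, but every decoder should render it as a smooth gradient, since the .Net codec's special-case handling of grayscale JPEGs never comes into play.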

