I'm programming for the iPhone and I have a 3-channel UIImage taken from the iPhone camera. I'm trying to get the RGB values for different areas of this image. I currently cross-reference the RGB values I get on the iPhone with the Digital Color Meter that comes with Mac OS X.
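For reference, a minimal sketch of the kind of pixel sampling involved (not necessarily the exact code in use here): it redraws the UIImage into an 8-bit sRGB bitmap and indexes one pixel. The function name and the top-left coordinate convention are illustrative assumptions.

```swift
import UIKit

// Sketch: redraw the UIImage into an 8-bit sRGB bitmap buffer and read
// one pixel. x/y are measured in pixels from the image's top-left corner.
func pixelRGB(in image: UIImage, x: Int, y: Int) -> (r: UInt8, g: UInt8, b: UInt8)? {
    guard let cgImage = image.cgImage,
          x >= 0, y >= 0, x < cgImage.width, y < cgImage.height else { return nil }

    let bytesPerRow = cgImage.width * 4
    var buffer = [UInt8](repeating: 0, count: bytesPerRow * cgImage.height)

    guard let context = CGContext(data: &buffer,
                                  width: cgImage.width,
                                  height: cgImage.height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: bytesPerRow,
                                  space: CGColorSpace(name: CGColorSpace.sRGB)!,
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { return nil }

    context.draw(cgImage, in: CGRect(x: 0, y: 0,
                                     width: cgImage.width, height: cgImage.height))

    // CGBitmapContext stores the top row of the image first in memory.
    let offset = y * bytesPerRow + x * 4
    return (buffer[offset], buffer[offset + 1], buffer[offset + 2])
}
```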
Most of the values I obtain match, but for certain colors the RGB values I output and the values the Digital Color Meter reads are very different.
For example, the following link shows a square whose color, as I calculate it, differs from the value the color meter reports:
http://www.learntobe.org/urs/square.php
The RGB we calculate on the iPhone for this square is (41, 116, 86) (also validated with the 'Color Expert' application). The value measured by the Mac OS X color meter is (0, 121, 87).
Clearly, the R value is far off. Everywhere the colors differ, the cause seems to be a large discrepancy in the R value. Is there a specific reason for this?
Thanks in advance for your help!
This is to be expected.
iOS is not color managed, but Mac OS X is. This means that Mac OS X takes an image value of (255, 0, 0) and transforms it to a good match for the current display.
This is done so that you can view a copy of the same image on each of two displays and have both copies appear the same. For some pairs of displays, (255, 0, 0) on one may look the same as (233, 89, 31) on the other.
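You can see the same kind of matching directly with Core Graphics (a sketch: Display P3 here merely stands in for "some other display's profile", and the exact numbers depend on the profiles involved):

```swift
import CoreGraphics

// Two different color spaces for the same nominal color.
let srgb = CGColorSpace(name: CGColorSpace.sRGB)!
let p3 = CGColorSpace(name: CGColorSpace.displayP3)!

let red = CGColor(colorSpace: srgb, components: [1.0, 0.0, 0.0, 1.0])!

// Ask Core Graphics to match sRGB "pure red" into the P3 space, the way
// a color-managed system matches image values to a display's profile.
if let matched = red.converted(to: p3, intent: .defaultIntent, options: nil),
   let comps = matched.components {
    // Prints approximately [0.918, 0.200, 0.139, 1.000]: the value sent
    // to the display is no longer (255, 0, 0).
    print(comps.map { String(format: "%.3f", $0) })
}
```

iOS at the time did no such matching, so the raw camera values were handed to the display untouched, which is why the two readings disagree.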