I'm just starting to learn about Quartz Composer, and the first thing I would like to create is a composition that could eventually be used in a Cocoa application: it would accept a black-and-white image and two NSColors, changing the black pixels to NSColor #1 and the white pixels to NSColor #2.
I've spent some time playing with QC, but cannot seem to figure out how to put all of the pieces together.
About the only thing I have figured out is that I need to use the Image Filter template. I do see there is an Image Pixel patch that can get pixels from an image; however, I don't see a patch to set a pixel. It also seems the Pixellate patch might be necessary, although I shouldn't have to worry about it producing an image with infinite dimensions, since my source images will only be fixed-size PNG images.
Take a look at the False Color patch: it takes an image and remaps it with a pair of colors.
In fact, since the False Color patch is just a wrapper around the Core Image filter with the same name (CIFalseColor), you could do this without involving Quartz Composer at all: just set up and apply a CIFilter instance to your NSImage.
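For reference, a minimal sketch of that approach in Swift, using the standard Core Image and AppKit APIs (the helper function name and its parameter names are my own, not from the original post):

import AppKit
import CoreImage

// Recolor a grayscale NSImage with CIFalseColor:
// darkColor replaces black, lightColor replaces white.
func recolor(_ image: NSImage, darkColor: NSColor, lightColor: NSColor) -> NSImage? {
    guard let tiff = image.tiffRepresentation,
          let input = CIImage(data: tiff),
          let filter = CIFilter(name: "CIFalseColor"),
          let color0 = CIColor(color: darkColor),
          let color1 = CIColor(color: lightColor) else { return nil }

    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(color0, forKey: "inputColor0")   // color used for dark pixels
    filter.setValue(color1, forKey: "inputColor1")   // color used for light pixels
    guard let output = filter.outputImage else { return nil }

    // Wrap the resulting CIImage back up as an NSImage.
    let rep = NSCIImageRep(ciImage: output)
    let result = NSImage(size: rep.size)
    result.addRepresentation(rep)
    return result
}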
Edit: or write your own Core Image filter, starting with something like this:
kernel vec4 remapBasedOnRed(sampler image, __color colorForDark, __color colorForLight)
{
    return mix(colorForDark, colorForLight, sample(image, samplerCoord(image)).r);
}
...which takes the brightness of the red channel of the input image (sample(image, samplerCoord(image)).r) and uses it as a coefficient for linear interpolation between colorForDark and colorForLight.
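If you want to use that kernel outside of Quartz Composer, one way to compile and apply the kernel string from Cocoa is via CIKernel. This is only a sketch under assumptions (the remap wrapper function is hypothetical, and newer macOS releases prefer Metal-based kernels over CIKernel(source:)):

import CoreImage

let kernelSource = """
kernel vec4 remapBasedOnRed(sampler image, __color colorForDark, __color colorForLight)
{
    return mix(colorForDark, colorForLight, sample(image, samplerCoord(image)).r);
}
"""

// Compile the kernel and apply it to a CIImage.
func remap(_ input: CIImage, dark: CIColor, light: CIColor) -> CIImage? {
    guard let kernel = CIKernel(source: kernelSource) else { return nil }
    return kernel.apply(
        extent: input.extent,
        roiCallback: { _, rect in rect },   // each output pixel reads only the matching input pixel
        arguments: [input, dark, light])
}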