I've been stuck on the following problem for two days and can't solve it. Here is the code I have:
+ (NSArray *)getRGB:(UIImage *)image atX:(int)xx andY:(int)yy count:(int)count
{
    NSMutableArray *result = [NSMutableArray arrayWithCapacity:count];

    // First get the image into your data buffer
    CGImageRef imageRef = [image CGImage];
    NSUInteger width = CGImageGetWidth(imageRef);
    NSUInteger height = CGImageGetHeight(imageRef);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    unsigned char *rawData = malloc(height * width * 4);
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    CGContextRef context = CGBitmapContextCreate(rawData, width, height,
                                                 bitsPerComponent, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGContextRelease(context);

    // Now rawData contains the image data in the RGBA8888 pixel format.
    int byteIndex = (bytesPerRow * yy) + xx * bytesPerPixel;
    for (int ii = 0; ii < count; ++ii)
    {
        CGFloat red   = (rawData[byteIndex]     * 1.0) / 255.0;
        CGFloat green = (rawData[byteIndex + 1] * 1.0) / 255.0;
        CGFloat blue  = (rawData[byteIndex + 2] * 1.0) / 255.0;
        CGFloat alpha = (rawData[byteIndex + 3] * 1.0) / 255.0;
        byteIndex += 4;
        UIColor *acolor = [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
        [result addObject:acolor];
    }
    free(rawData);
    return result;
}
- (void)getTouchColor:(UITapGestureRecognizer *)touch
{
    UIAlertView *alert = [[[UIAlertView alloc] initWithTitle:@"entrei" message:@"tap" delegate:nil cancelButtonTitle:@"cancel" otherButtonTitles:nil] autorelease];
    [alert show];
    //NSArray *Mycolors=[[[NSArray alloc] init]retain];
    CGPoint point = [touch locationInView:MyImg];
    //MyColors=[[NSArray alloc] init];
    //GetPhoto is the name of the class
    NSArray *myColors = [GetPhoto getRGB:MyImg.image AtX:point.x AndY:point.y count:3];
    //NSLog(@"cliquei");
}
I'm trying to fill the NSArray called MyColors with the result of getRGBAsFromImage, but I get the warning "NSArray may not respond to ...".
I am using the call shown above in getTouchColor.
I wonder where I am going wrong!
Thank you very much.
//* sorry for my English, complain to Larry Page *//
You seem to be calling the method on an instance of NSArray (I assume the actual calling code looks different?). Since NSArray doesn't recognize the selector, the warning is given. Of which class is this a method? For the sake of this answer, let's call it BrunosClass. Then the call should be:
//remove the other line you showed.
NSArray *myColors = [BrunosClass /* <--subst with real name of class */
getRGBAsFromImage: myImg.image atX: point.x andY: point.y count: 3];
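Whatever you end up calling the method, it also needs to be declared in the class's @interface so the compiler can see the selector at the call site; otherwise the "may not respond to" warning will stay. A minimal sketch of such a declaration, assuming the class really is named GetPhoto (as the comment in the question suggests) and keeping the signature from the posted implementation:

// GetPhoto.h -- hypothetical file name; the class name is taken from the
// question's comment "GetPhoto is the name of the class".
#import <UIKit/UIKit.h>

@interface GetPhoto : NSObject

// Declaring the class method here makes the selector visible to the compiler.
// Note that selectors are case-sensitive: the call must use atX:/andY:
// exactly as declared (the question's call uses AtX:/AndY:).
+ (NSArray *)getRGB:(UIImage *)image atX:(int)xx andY:(int)yy count:(int)count;

@end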
Although I know that a solution should really fix the existing problem, there is a class I made called ANImageBitmapRep. It allows easy access to an image's pixel data. It's on GitHub here. With ANImageBitmapRep, it's easy to get pixels from an image like this:
// replace myImage with your UIImage
ANImageBitmapRep * ibr = [[ANImageBitmapRep alloc] initWithImage:myImage];
BMPixel pixel = [ibr getPixelAtPoint:BMPointMake(0, 0)];
NSLog(@"Red: %f green: %f blue: %f alpha: %f", pixel.red, pixel.green, pixel.blue, pixel.alpha);
[ibr release];
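If you need several consecutive pixels, as the count: parameter in the question suggests, you could simply call it in a loop. A short sketch reusing only the calls shown above; point and MyImg are assumed to come from the question's getTouchColor:

// Sketch: collect three consecutive pixels starting at the tapped point.
// `point` and `MyImg` are the variables from the question's getTouchColor.
ANImageBitmapRep *ibr = [[ANImageBitmapRep alloc] initWithImage:MyImg.image];
NSMutableArray *colors = [NSMutableArray arrayWithCapacity:3];
for (int i = 0; i < 3; i++) {
    BMPixel pixel = [ibr getPixelAtPoint:BMPointMake(point.x + i, point.y)];
    [colors addObject:[UIColor colorWithRed:pixel.red green:pixel.green blue:pixel.blue alpha:pixel.alpha]];
}
[ibr release];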
The code that you have also seems highly inefficient if you need to read pixels repeatedly, since it re-creates the bitmap context and redraws the entire image every time the method is called, even though only a few bytes of the buffer are actually read.
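One way to avoid that without a third-party class is to draw the image into the byte buffer once and keep it around for later lookups. A minimal sketch of that idea, using a hypothetical PixelBuffer helper (not part of the question's code or of UIKit), written in the same manual-retain-release style as the rest of the thread:

// PixelBuffer -- hypothetical helper, not from the original question.
// It draws the UIImage into an RGBA8888 buffer once, then serves any number
// of colorAtX:y: lookups without touching Core Graphics again.
#import <UIKit/UIKit.h>

@interface PixelBuffer : NSObject {
    unsigned char *rawData;
    NSUInteger width;
    NSUInteger height;
}
- (id)initWithImage:(UIImage *)image;
- (UIColor *)colorAtX:(int)xx y:(int)yy;
@end

@implementation PixelBuffer

- (id)initWithImage:(UIImage *)image
{
    if ((self = [super init])) {
        CGImageRef imageRef = [image CGImage];
        width   = CGImageGetWidth(imageRef);
        height  = CGImageGetHeight(imageRef);
        rawData = malloc(height * width * 4);

        // Draw the image once; afterwards rawData holds RGBA8888 pixels.
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(rawData, width, height, 8, width * 4,
                                                     colorSpace,
                                                     kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
        CGColorSpaceRelease(colorSpace);
        CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
        CGContextRelease(context);
    }
    return self;
}

- (UIColor *)colorAtX:(int)xx y:(int)yy
{
    // Each pixel occupies 4 bytes: R, G, B, A.
    NSUInteger byteIndex = (width * 4 * yy) + xx * 4;
    return [UIColor colorWithRed:rawData[byteIndex]     / 255.0
                           green:rawData[byteIndex + 1] / 255.0
                            blue:rawData[byteIndex + 2] / 255.0
                           alpha:rawData[byteIndex + 3] / 255.0];
}

- (void)dealloc
{
    free(rawData);
    [super dealloc];
}

@end

With something like this, getTouchColor could build the buffer once (for example when MyImg.image is set) and then call colorAtX:y: on every tap without re-drawing the image each time.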