I am developing an iPhone app that resizes and merges images.
I want to select two 1600x1200 photos from the photo library, merge them into a single image, and save that new image back to the photo library.
However, I can't get the right size for the merged image.
I use two image views with 320x480 frames and set each view's image to one of the imported photos. After manipulating the images (zooming, cropping, rotating), I save the result to the album. When I check the saved image's size, it shows 600x800. How do I get the original size of 1600x1200?
I've been stuck on this problem for two weeks!
Thanks in advance.
The frame of the UIImageView has nothing to do with the size of the image it displays. If you display a 1200x1600-pixel image in a 75x75 image view, the image in memory is still 1200x1600. Somewhere in your processing of the image you are resetting its size.
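A quick way to see the distinction (the variable names here are mine, purely for illustration):

UIImageView *thumbView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 75, 75)];
thumbView.image = photo; // suppose photo is 1600x1200
NSLog(@"view:  %@", NSStringFromCGSize(thumbView.frame.size));  // 75x75
NSLog(@"image: %@", NSStringFromCGSize(thumbView.image.size));  // still 1600x1200 (in points)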
You need to resize the images programmatically behind the scenes and ignore how they are displayed. For highest fidelity, I suggest performing all processing on the image at full size and resizing only the final result. For speed and low memory use, resize smaller first, process, and then resize again as needed. A sketch of the off-screen approach follows.
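For example, here is a minimal sketch of merging two full-resolution photos off-screen; the 1600x1200 canvas, the side-by-side layout, and the names photoA/photoB are just assumptions for illustration:

// Draw both full-resolution photos into one off-screen bitmap;
// the on-screen image views never enter into it.
UIGraphicsBeginImageContextWithOptions(CGSizeMake(1600, 1200), YES, 1.0); // scale 1.0 = exact pixel size
[photoA drawInRect:CGRectMake(0, 0, 800, 1200)];   // left half; drawInRect: scales as needed
[photoB drawInRect:CGRectMake(800, 0, 800, 1200)]; // right half
UIImage *merged = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();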
I use Trevor Harmon's UIImage+Resize to resize images.
His core method looks like this:
- (UIImage *)resizedImage:(CGSize)newSize
                transform:(CGAffineTransform)transform
           drawTransposed:(BOOL)transpose
     interpolationQuality:(CGInterpolationQuality)quality
{
    CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    CGRect transposedRect = CGRectMake(0, 0, newRect.size.height, newRect.size.width);
    CGImageRef imageRef = self.CGImage;

    // Build a context that's the same dimensions as the new size
    CGContextRef bitmap = CGBitmapContextCreate(NULL,
                                                newRect.size.width,
                                                newRect.size.height,
                                                CGImageGetBitsPerComponent(imageRef),
                                                0,
                                                CGImageGetColorSpace(imageRef),
                                                CGImageGetBitmapInfo(imageRef));

    // Rotate and/or flip the image if required by its orientation
    CGContextConcatCTM(bitmap, transform);

    // Set the quality level to use when rescaling
    CGContextSetInterpolationQuality(bitmap, quality);

    // Draw into the context; this scales the image
    CGContextDrawImage(bitmap, transpose ? transposedRect : newRect, imageRef);

    // Get the resized image from the context and a UIImage
    CGImageRef newImageRef = CGBitmapContextCreateImage(bitmap);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];

    // Clean up
    CGContextRelease(bitmap);
    CGImageRelease(newImageRef);

    return newImage;
}
Harmon's code saved me dozens of man-hours of trying to get resizing done correctly.
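To give an idea of usage, calling the category looks roughly like this (the public resizedImage:interpolationQuality: wrapper is part of the same category, if I remember his API correctly):

#import "UIImage+Resize.h"

// Scale a full-size photo down before heavy processing
UIImage *smaller = [original resizedImage:CGSizeMake(800, 600)
                     interpolationQuality:kCGInterpolationHigh];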
I solved it as follows:
// Build an off-screen view at the full target size, not the on-screen 320x480 view
UIView *bgView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 1600, 1200)];
// ... add the full-size image views as subviews of bgView here ...
UIGraphicsBeginImageContext(bgView.bounds.size);
[bgView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(viewImage, self, nil, nil);
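One caveat worth noting: UIGraphicsBeginImageContext always creates a 1x-scale context, so the snippet above really does produce a 1600x1200-pixel image. If you ever want the device's Retina scale instead, pass it explicitly:

UIGraphicsBeginImageContextWithOptions(bgView.bounds.size, YES, 0.0); // 0.0 = use the screen's scale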
Thanks for all your support in solving the issue.