In my efforts to improve my apps, I need to check the width and height of a UIImage object.
The problem is the inconsistent data I am getting from the .size.width (and .size.height) properties.
If an image is landscape, image.size.width returns its width as expected. But when the image is portrait, image.size.width returns its height instead, which strikes me as odd.
I could easily swap the values to correct this, but I'd also like to understand what's going on. Is this a bug, or am I doing something wrong?
Here is how I am retrieving the image properties:
CGFloat w = bigImage.size.width;
CGFloat h = bigImage.size.height;
NSLog(@"%.2f, %.2f", w, h);
I've tested this both in the simulator and on my iPhone 4, and both give the same result.
Any ideas why? Thanks.
I just figured it out. .size.width was indeed returning the correct values, but an earlier method was somehow rotating the image to landscape when saving it to the sandbox. That method saved the image to the app sandbox raw, straight from the camera/photo albums. I added a preprocessing method that normalizes the full-resolution image before saving it to the sandbox, and now it works.
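For anyone hitting the same issue: this kind of preprocessing step usually means redrawing the image so that its pixel data matches its .imageOrientation metadata. Below is a minimal sketch of such a helper; the method name normalizedImage: is my own choice, not the original poster's code.

- (UIImage *)normalizedImage:(UIImage *)image {
    // Already upright: the pixel data and orientation metadata agree.
    if (image.imageOrientation == UIImageOrientationUp) {
        return image;
    }
    // Redraw into a new context; drawInRect: applies the orientation,
    // so the resulting image has UIImageOrientationUp and correct size.
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}

Saving the result of this helper (rather than the raw camera image) keeps the width/height consistent with what you later read back from the sandbox.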