I'm grabbing frame images from the iPhone's camera at a rate of 25 fps, at a resolution of 192 x 144, using the 420v/BGRA pixel format.
I'm converting the CVImageBufferRefs into UIImages and then calling UIImageJPEGRepresentation(image, compressionQuality) to get a compressed JPEG version of each image.
Using the Time Profiler in Instruments, I can see that 75% of my CPU time is spent getting the JPEG representation of the image, which slows down the other work the app needs to do.
The time fluctuates a little: it drops if I set the compression quality to 1.0 (i.e., no compression) and rises if I set it to 0.0 (i.e., maximum compression).
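For reference, here's a rough sketch of the conversion path described above, assuming an AVCaptureVideoDataOutput delegate callback and a CIImage-based conversion (the exact drawing code may differ in my app); the 0.5 quality value is just a placeholder:

    #import <AVFoundation/AVFoundation.h>
    #import <CoreImage/CoreImage.h>
    #import <UIKit/UIKit.h>

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
        didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
               fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

        // Wrap the pixel buffer and render it to a CGImage; this drawing step
        // plus the JPEG encode below is where the time goes.
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef cgImage = [context createCGImage:ciImage fromRect:[ciImage extent]];
        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);

        // This call is what shows up as ~75% of CPU time in the Time Profiler.
        NSData *jpegData = UIImageJPEGRepresentation(image, 0.5);
        // ... hand jpegData off to the rest of the app ...
    }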
Is there a more efficient way to get a JPEG representation of an image from the iPhone's camera?
Can I get a JPEG representation without converting the CVImageBufferRef to a UIImage (and therefore cutting out a rather expensive Core Graphics drawing operation)?
Is the concern the responsiveness of the application, or the actual compression time required? What about wrapping the JPEG code in a block and putting it on a background queue?
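A minimal sketch of that suggestion, assuming the UIImage has already been created in the capture callback (the queue name and variable names here are placeholders):

    dispatch_queue_t jpegQueue = dispatch_queue_create("com.example.jpegEncode", DISPATCH_QUEUE_SERIAL);

    // In the capture callback, once the UIImage exists:
    dispatch_async(jpegQueue, ^{
        NSData *jpegData = UIImageJPEGRepresentation(image, 0.5);
        dispatch_async(dispatch_get_main_queue(), ^{
            // Use jpegData on the main thread (UI update, enqueue for upload, etc.).
        });
    });

Note that this keeps the main thread responsive but doesn't reduce the total CPU cost; if encoding can't keep up with 25 fps, frames will still back up or need to be dropped.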