Are the Core Image filters in iOS 5.0 fast enough for realtime video processing?

Now that Apple has ported the Core Image framework over to iOS 5.0, I'm wondering: is Core Image fast enough to apply live filters and effects to camera video?

Also, what would be a good starting point to learn the Core Image framework for iOS 5.0?


Now that Core Image has been out on iOS for a while, we can talk about some hard performance numbers. I created a benchmark application as part of the testing for my GPUImage framework, and profiled the performance of raw CPU-based filters, Core Image filters, and GPUImage filters with live video feeds. The following were the times (in milliseconds) each took to apply a single gamma filter on a 640x480 video frame from the iPhone's camera (for two different hardware models running two different OS versions):

             iPhone 4 (iOS 5)   | iPhone 4S (iOS 6)
------------------------------------------------
CPU          458 ms (2.2 FPS)     183 ms (5.5 FPS)
Core Image   106 ms (9.4 FPS)     8.2 ms (122 FPS)
GPUImage     2.5 ms (400 FPS)     1.8 ms (555 FPS)

For Core Image, this translates into a maximum of 9.4 FPS for a simple gamma filter on iPhone 4, but well over 60 FPS for the same on an iPhone 4S. This is about the simplest Core Image filter case you can set up, so performance will certainly vary with more complex operations. This would seem to indicate that Core Image cannot do live processing fast enough to match the iPhone's camera rate on the iPhone 4 running iOS 5, but as of iOS 6, it processes video more than fast enough to do live filtering on iPhone 4S and above.
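
For context, a rough sketch of the Core Image path being timed above might look like the following. This is not the exact benchmark code; it assumes _ciContext is a CIContext created once and reused, and uses the standard CIGammaAdjust filter and its inputPower key:

    // Illustrative sketch: apply a Core Image gamma adjustment to each camera
    // frame delivered by AVCaptureVideoDataOutput. _ciContext is assumed to be
    // a CIContext created once (e.g. [CIContext contextWithOptions:nil]);
    // creating a context per frame would dominate the timings.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

        CIFilter *gamma = [CIFilter filterWithName:@"CIGammaAdjust"];
        [gamma setValue:inputImage forKey:kCIInputImageKey];
        [gamma setValue:[NSNumber numberWithFloat:2.2f] forKey:@"inputPower"];

        // Render the filtered frame; a real pipeline would display or encode
        // this instead of discarding it.
        CGImageRef filteredFrame = [_ciContext createCGImage:[gamma outputImage]
                                                    fromRect:[inputImage extent]];
        CGImageRelease(filteredFrame);
    }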

The source for these benchmarks can be found in my GitHub repository, if you wish to see where I got these numbers from.

I've updated this answer from my original, which was too critical of Core Image's performance. The sepia tone filter I was using as a basis of comparison was not performing the same operation as my own, so it was a poor benchmark. The performance of Core Image filters also improved significantly in iOS 6, which helped make them more than fast enough to process live video on iPhone 4S and up. Also, I've since found several cases, like large-radius blurs, where Core Image significantly outperforms my GPUImage framework.
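
For comparison, the GPUImage side of a live gamma filter is set up roughly like this. This is a minimal sketch; GPUImageVideoCamera, GPUImageGammaFilter, and GPUImageView are classes from that framework, and exact property names may vary between versions:

    // Illustrative GPUImage pipeline: camera -> gamma filter -> on-screen view.
    GPUImageVideoCamera *videoCamera =
        [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                            cameraPosition:AVCaptureDevicePositionBack];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

    // The same gamma adjustment used in the benchmark above.
    GPUImageGammaFilter *gammaFilter = [[GPUImageGammaFilter alloc] init];
    gammaFilter.gamma = 2.2;

    // filteredView is assumed to be a GPUImageView already in the view hierarchy.
    [videoCamera addTarget:gammaFilter];
    [gammaFilter addTarget:filteredView];
    [videoCamera startCameraCapture];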

Previous answer, for posterity:

As with any performance-related question, the answer will depend on the complexity of your filters, the image size being filtered, and the performance characteristics of the device you're running on.

Because Core Image has been available for a while on the Mac, I can point you to the Core Image Programming Guide as a resource for learning the framework. I can't comment on the iOS-specific elements, given the NDA, but I highly recommend watching the video for WWDC 2011 Session 422 - Using Core Image on iOS and Mac OS X.

Core Image (mostly) uses the GPU for image processing, so you could look at how fast OpenGL ES 2.0 shaders handle image processing on existing devices. I did some work in this area recently, and found that the iPhone 4 could do 60 FPS processing with a simple shader on realtime video being fed in at 480 x 320. You could download my sample application and attempt to customize the shader and/or video input size to determine whether your particular device could handle this processing at a decent framerate. Core Image may add a little overhead, but it also has some clever optimizations for how it organizes filter chains.
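
As a rough illustration, the kind of simple per-pixel shader I'm describing is no more involved than the fragment shader below (shown here as an Objective-C string constant; the vertex shader, program setup, and camera texture upload are omitted, and the uniform/varying names are placeholders that must match whatever the host program binds):

    // A minimal OpenGL ES 2.0 fragment shader: sample the camera texture and
    // apply a per-pixel gamma adjustment.
    static NSString *const kGammaFragmentShaderString =
        @"varying highp vec2 textureCoordinate;\n"
        @"uniform sampler2D inputImageTexture;\n"
        @"uniform lowp float gamma;\n"
        @"void main()\n"
        @"{\n"
        @"    lowp vec4 color = texture2D(inputImageTexture, textureCoordinate);\n"
        @"    gl_FragColor = vec4(pow(color.rgb, vec3(gamma)), color.a);\n"
        @"}\n";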

The slowest compatible devices out there would be the iPhone 3GS and the 3rd generation iPod touch, but they're not that much slower than the iPhone 4. The iPad 2 blows them all away with its massive fragment processing power.


IMHO, Core Image should be your first option as of iOS 6.0. There are some good features you're going to like, such as:

  1. Core Image input parameters support OpenGL ES textures (glTexture2d);
  2. Core Image output parameters support OpenGL ES textures (glTexture2d);
  3. You can choose to run Core Image exclusively on the GPU by initializing a CIContext from an EAGL context, e.g. _CIglContext = [CIContext contextWithEAGLContext:_glContext options:opts]; (a sketch follows below);
  4. Many filters are now available on the iOS platform, 93 filters or so;
  5. You can choose to process buffers exclusively with the CPU while your application is temporarily running in the background, though this is not recommended.

These are all mentioned in the WWDC 2012 session video on Core Image. Have a look; maybe you will find your solution there.
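
A minimal sketch of point 3 above, assuming _glContext is an existing EAGLContext with a framebuffer already bound, pixelBuffer is the current camera frame, and viewWidth/viewHeight and the sepia filter are placeholders:

    // GPU-only Core Image: create the CIContext from an EAGL context so the
    // filtered output is drawn straight into the bound OpenGL ES framebuffer.
    // In practice the context is created once and kept around.
    NSDictionary *opts = @{ kCIContextWorkingColorSpace : [NSNull null] };
    CIContext *_CIglContext = [CIContext contextWithEAGLContext:_glContext options:opts];

    CIImage *cameraImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
    [sepia setValue:cameraImage forKey:kCIInputImageKey];
    [sepia setValue:@0.8f forKey:@"inputIntensity"];

    // Draw into the framebuffer currently bound on _glContext, then present it.
    [_CIglContext drawImage:[sepia outputImage]
                     inRect:CGRectMake(0, 0, viewWidth, viewHeight)
                   fromRect:[cameraImage extent]];
    [_glContext presentRenderbuffer:GL_RENDERBUFFER];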


It will be almost realtime on the iPhone 4 and later. The iPhone 3GS will be a little choppy, and the iPhone 3G and earlier are not recommended.
