iOS: Video as GL texture with alpha transparency

I'm trying to figure out the best approach to display a video on a GL texture while preserving the transparency of the alpha channel.

Information about using video as a GL texture is here: Is it possible using video as texture for GL in iOS? and iOS4: how do I use video file as an OpenGL texture?

Using ffmpeg to help with alpha transparency (though not App Store friendly) is covered here: iPhone: Display a semi-transparent video on top of a UIView?

The video source would be filmed in front of a green screen for chroma keying. The video could be left untouched, keeping the green screen, or processed in a video editing suite and exported to QuickTime Animation or Apple ProRes 4444 with an alpha channel.

There are multiple approaches that I think could potentially work, but I haven't found a full solution.

  1. Real-time threshold processing of the video, looking for green to remove
  2. Figure out how to use the above-mentioned QuickTime codecs to preserve the alpha channel
  3. Blend two videos together: 1) the main video with RGB, 2) a separate video carrying the alpha mask (a shader sketch for this follows the list)
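For approach 3, a fragment shader can sample the RGB movie from one texture unit and the mask movie from a second, taking alpha from the mask's luminance. A minimal GLSL sketch, assuming both movies are already uploaded as textures (the uniform names are placeholders, not from any particular sample):

    // Hypothetical fragment shader for approach 3: color from the RGB movie,
    // alpha from the red channel of a separate grayscale mask movie.
    static NSString *const kAlphaMaskFragmentShader =
        @"varying highp vec2 textureCoordinate;\n"
        @"uniform sampler2D colorVideoTexture;\n"   // RGB movie frame
        @"uniform sampler2D alphaVideoTexture;\n"   // grayscale mask frame
        @"void main()\n"
        @"{\n"
        @"    lowp vec3  rgb  = texture2D(colorVideoTexture, textureCoordinate).rgb;\n"
        @"    lowp float mask = texture2D(alphaVideoTexture, textureCoordinate).r;\n"
        @"    gl_FragColor = vec4(rgb, mask);\n"
        @"}";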

I would love to get your thoughts on the best approach for iOS and OpenGL ES 2.0.

Thanks.


The easiest way to do chroma keying for simple blending of a movie and another scene would be to use the GPUImageChromaKeyBlendFilter from my GPUImage framework. You can supply the movie source as a GPUImageMovie, and then blend that with your background content. The chroma key filter allows you to specify a color, a proximity to that color, and a smoothness of blending to use in the replacement operation. All of this is GPU-accelerated via tuned shaders.
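A rough sketch of the wiring, based on GPUImage's documented classes (the asset names and parameter values here are placeholders, and property names may differ slightly between GPUImage versions):

    #import "GPUImage.h"

    // Key green out of a movie and blend it over a still background image.
    NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"greenscreen" withExtension:@"m4v"]; // placeholder asset
    GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:movieURL];
    movieFile.playAtActualSpeed = YES;

    GPUImagePicture *background = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"background.png"]];

    GPUImageChromaKeyBlendFilter *chromaKeyBlend = [[GPUImageChromaKeyBlendFilter alloc] init];
    [chromaKeyBlend setColorToReplaceRed:0.0 green:1.0 blue:0.0]; // the color to key on
    chromaKeyBlend.thresholdSensitivity = 0.4;                    // proximity to the key color
    chromaKeyBlend.smoothing = 0.1;                               // softness of the blend edge

    [movieFile addTarget:chromaKeyBlend];  // first input: the keyed movie
    [background addTarget:chromaKeyBlend]; // second input: the replacement background

    GPUImageView *filterView = (GPUImageView *)self.view; // assumes the view is a GPUImageView
    [chromaKeyBlend addTarget:filterView];

    [background processImage];
    [movieFile startProcessing];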

Images, movies, and the live cameras can be used as sources, but if you wish to render this with OpenGL ES content behind your movie, I'd recommend rendering your OpenGL ES content to a texture-backed FBO and passing that texture in via a GPUImageTextureInput.
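A minimal sketch of that direction, assuming a 1280x720 scene; the actual drawing is elided, and the texture has to live in a GL context shared with GPUImage's:

    // Render your own OpenGL ES content into a texture-backed FBO...
    GLuint backgroundTexture;
    glGenTextures(1, &backgroundTexture);
    glBindTexture(GL_TEXTURE_2D, backgroundTexture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1280, 720, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    GLuint offscreenFBO;
    glGenFramebuffers(1, &offscreenFBO);
    glBindFramebuffer(GL_FRAMEBUFFER, offscreenFBO);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, backgroundTexture, 0);
    // ... draw the OpenGL ES scene here ...

    // ...then feed that texture in as the second input of the blend filter
    // (chromaKeyBlend from the previous sketch), in place of the still image.
    GPUImageTextureInput *textureInput =
        [[GPUImageTextureInput alloc] initWithTexture:backgroundTexture size:CGSizeMake(1280.0, 720.0)];
    [textureInput addTarget:chromaKeyBlend];
    [textureInput processTextureWithFrameTime:kCMTimeZero]; // pushes one frame through the chain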

You could possibly use this to output a texture containing your movie frames with the keyed color replaced by a constant color with a 0 alpha channel, as well. This texture could be extracted using a GPUImageTextureOutput for later use in your OpenGL ES scene.
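That route might look roughly like this, with GPUImageChromaKeyFilter (the non-blending variant, which replaces the keyed color with transparency) feeding a GPUImageTextureOutput; treat the delegate and release calls as a sketch to be checked against the current GPUImage headers:

    // Setup: movie -> chroma key (green becomes alpha 0) -> texture output.
    GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:movieURL]; // movieURL as above

    GPUImageChromaKeyFilter *chromaKey = [[GPUImageChromaKeyFilter alloc] init];
    [chromaKey setColorToReplaceRed:0.0 green:1.0 blue:0.0];

    GPUImageTextureOutput *textureOutput = [[GPUImageTextureOutput alloc] init];
    textureOutput.delegate = self; // adopts GPUImageTextureOutputDelegate

    [movieFile addTarget:chromaKey];
    [chromaKey addTarget:textureOutput];
    [movieFile startProcessing];

    // Delegate callback: use the texture in your own OpenGL ES scene, then hand it back.
    - (void)newFrameReadyFromTextureOutput:(GPUImageTextureOutput *)callbackTextureOutput
    {
        glBindTexture(GL_TEXTURE_2D, callbackTextureOutput.texture);
        // ... draw a quad with this texture, with GL blending enabled ...
        [callbackTextureOutput doneWithTexture]; // lets GPUImage recycle the framebuffer
    }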


Apple showed a sample app at WWDC 2011 called ChromaKey that demonstrates, in a very performant way, how to take frames of video, pass them to an OpenGL texture, manipulate them, and optionally write them out to a video file.

It's written to use a feed from the video camera, and it uses a very crude chroma-key algorithm.

As the other poster said, you'll probably want to skip the chromakey code and do the color knockout yourself beforehand.

It shouldn't be that hard to rewrite the ChromaKey sample app to use a video file as input instead of a camera feed, and it's quite easy to disable the chroma-key code.

You'd need to modify the setup on the video input to expect RGBA data instead of RGB or Y/UV. The sample app is set up to use RGB, but I've seen other example apps from Apple that use Y/UV instead.
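For a file-based input, that mostly comes down to the pixel format requested from AVAssetReader. A sketch, with the asset URL as a placeholder and error handling omitted:

    #import <AVFoundation/AVFoundation.h>

    // Ask for 32-bit BGRA so an alpha channel in the source survives into the
    // CVPixelBuffers that get uploaded to the GL texture.
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil]; // movieURL is a placeholder
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];

    NSDictionary *outputSettings = @{
        (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
    };
    AVAssetReaderTrackOutput *trackOutput =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:outputSettings];

    NSError *error = nil;
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
    [reader addOutput:trackOutput];
    [reader startReading];

    // Per frame:
    CMSampleBufferRef sampleBuffer = [trackOutput copyNextSampleBuffer];
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // ... upload pixelBuffer as a GL_RGBA texture (or via CVOpenGLESTextureCache) ...
    CFRelease(sampleBuffer); // copyNextSampleBuffer returns a retained buffer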


Have a look at the free "APNG" app on the App Store. It shows how an animated PNG (.apng) can be rendered directly into an iOS view. The key is that APNG supports an alpha channel in the file format, so you don't need to mess around with chroma tricks that won't really work for all of your video content. This approach is also more efficient than multiple layers or chroma tricks, since another round of processing is not needed each time a texture is displayed in a loop.

If you want to look at a small example Xcode project that displays an alpha channel animation on the side of a spinning cube with OpenGL ES 2.0, see Load OpenGL textures with alpha channel on iOS. The example code shows a simple call to glTexImage2D() that uploads a texture to the graphics card once for each display link callback.
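The per-frame upload in that example boils down to something like the following sketch; frameWidth, frameHeight, framePixels, animationTexture, and drawScene are placeholders for the decoded animation frame and your own rendering:

    // Re-upload the current RGBA animation frame to the same texture on each tick.
    - (void)displayLinkCallback:(CADisplayLink *)displayLink
    {
        glBindTexture(GL_TEXTURE_2D, self.animationTexture);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                     frameWidth, frameHeight, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, framePixels); // premultiplied RGBA bytes

        glEnable(GL_BLEND);
        glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA); // blending for premultiplied alpha
        [self drawScene]; // placeholder: render the spinning cube and the textured quad
    }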

