I have a QTMovie open in QTKit.
I need to get each frame of this video in YV12 format (kYUV420PixelFormat), in real time (i.e. I'm passing it to foreign code that only accepts YV12 and needs to play the video in real time).
It seems The Way It Should Be Done is to call [movie frameImageAtTime: [movie currentTime] withAttributes: error:] for the current frame, then [movie stepForward] to get to the next frame, and so on until I have all the frames (see the sketch after the list below). However, as much as I look into it, I can't find a way to make QTKit give me the data in YV12 format, or any other YUV format. The frameImageAtTime: call can convert the frame to:
- NSImage (but NSImage can't store planar YUV),
- CGImage (same thing),
- CIImage (same thing),
- CVPixelBuffer (this one can store YUV, but there seems to be no way to configure the call to request YUV; by default it returns ARGB32 data),
- OpenGL texture (this probably can be configured as well, but I don't need this data in OpenGL, I need it in memory)
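For concreteness, here is a minimal sketch of that frame-stepping loop (my own code, not from the docs), assuming `movie` is the opened QTMovie and that we ask for CVPixelBufferRef output:

    // Sketch only: walk the movie frame by frame, asking for CVPixelBufferRefs.
    // In practice the buffers come back as k32ARGBPixelFormat, not YUV.
    NSDictionary *attrs = [NSDictionary dictionaryWithObjectsAndKeys:
        QTMovieFrameImageTypeCVPixelBufferRef, QTMovieFrameImageType,
        nil];

    NSError *error = nil;
    [movie gotoBeginning];
    while (QTTimeCompare([movie currentTime], [movie duration]) == NSOrderedAscending) {
        // frameImageAtTime:withAttributes:error: returns void *, hence the cast.
        CVPixelBufferRef frame =
            (CVPixelBufferRef)[movie frameImageAtTime:[movie currentTime]
                                       withAttributes:attrs
                                                error:&error];
        if (frame == NULL) break;
        // ... hand the frame off (it is ARGB here, not the YV12 we want) ...
        [movie stepForward];
    }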
So it seems the only way to use this supposedly new and optimized QTKit technology is to get ARGB data from it, convert each frame to YV12 with custom code, and hope that's still fast enough for real time. Or am I missing something?
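In case that fallback is the answer, the conversion itself is at least simple to sketch. Something like the following rough, untested code of mine; the BT.601 integer coefficients and the top-left (non-averaged) chroma sampling are assumptions to check against whatever the foreign code expects:

    // Sketch: convert one k32ARGBPixelFormat frame to planar YV12.
    // Assumes even width/height; the alpha byte is ignored.
    static void ARGBToYV12(const uint8_t *argb, size_t bytesPerRow,
                           size_t width, size_t height, uint8_t *yv12)
    {
        // YV12 layout: full-size Y plane, then quarter-size V plane, then U plane.
        uint8_t *yPlane = yv12;
        uint8_t *vPlane = yPlane + width * height;
        uint8_t *uPlane = vPlane + (width / 2) * (height / 2);

        for (size_t row = 0; row < height; row++) {
            const uint8_t *src = argb + row * bytesPerRow;
            for (size_t col = 0; col < width; col++) {
                // k32ARGBPixelFormat is big-endian: A, R, G, B per pixel.
                int r = src[4 * col + 1];
                int g = src[4 * col + 2];
                int b = src[4 * col + 3];
                yPlane[row * width + col] =
                    (uint8_t)(((66 * r + 129 * g + 25 * b + 128) >> 8) + 16);
                // One chroma sample per 2x2 block (top-left pixel, no averaging).
                if ((row & 1) == 0 && (col & 1) == 0) {
                    size_t ci = (row / 2) * (width / 2) + (col / 2);
                    uPlane[ci] = (uint8_t)(((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128);
                    vPlane[ci] = (uint8_t)(((112 * r - 94 * g - 18 * b + 128) >> 8) + 128);
                }
            }
        }
    }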
In old QuickTime it was relatively easy to set up a GWorld with kYUV420PixelFormat, have the Movie render into it, and it just worked. But the old QuickTime calls are legacy, deprecated calls, not to be used anymore...
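From memory, that deprecated route looked roughly like this (legacy C API, error handling omitted; note that for planar formats like kYUV420PixelFormat the pixmap base address is documented to point at a PlanarPixmapInfoYUV420 header describing the plane offsets):

    // Legacy approach (deprecated): render the Movie into a YUV 4:2:0 GWorld.
    Rect bounds;
    GWorldPtr gworld = NULL;

    GetMovieBox(theMovie, &bounds);
    QTNewGWorld(&gworld, kYUV420PixelFormat, &bounds, NULL, NULL, 0);
    SetMovieGWorld(theMovie, (CGrafPtr)gworld, NULL);

    // After each MoviesTask()/SetMovieTimeValue(), the pixels can be read
    // straight out of the GWorld's PixMap in planar YUV 4:2:0.
    PixMapHandle pixMap = GetGWorldPixMap(gworld);
    LockPixels(pixMap);
    void *baseAddr = GetPixBaseAddr(pixMap);  // PlanarPixmapInfoYUV420 header here
    // ... hand the planes to the YV12 consumer ...
    UnlockPixels(pixMap);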
What should I do to get YUV420 planar frames without the unnecessary conversions?
Based on this thread from one of the old Apple mailing lists, I'd say this at least used to be impossible. I'm now trying to find out whether it can be done with a lower-level API (see the sketch at the end of this post).
http://lists.apple.com/archives/quicktime-api/2008/Nov/msg00049.html
On Nov 5, 2008, at 12:08 PM, Neil Clayton wrote:
I'd like to get a YUV frame out of a movie (the movie is encoded in YUV, the frames were originally k2vuyPixelFormat, and the encoder format would have been compatible with that, e.g. H264 or AIC).
When I do this:
    NSError *error = nil;
    NSDictionary *dict = [NSDictionary dictionaryWithObjectsAndKeys:
        QTMovieFrameImageTypeCVPixelBufferRef, QTMovieFrameImageType,
        [NSNumber numberWithBool:YES], QTMovieFrameImageHighQuality,
        nil];
    // frameImageAtTime:withAttributes:error: returns void *, hence the cast.
    CVPixelBufferRef buffer =
        (CVPixelBufferRef)[qtMovie frameImageAtTime:QTMakeTime(lastFrame, movie.timeScale)
                                     withAttributes:dict
                                              error:&error];
The frame appears valid. It has a correct width and height. But it seems to be of type k32ARGBPixelFormat when I do:
OSType type = CVPixelBufferGetPixelFormatType(buffer);
Presuming I'm doing this the wrong way: what's the correct method for getting a frame of type k2vuyPixelFormat from a movie? Or, if this isn't possible, what's the easiest way to perform an RGB-to-YUV conversion into a CVPixelBuffer of type k2vuyPixelFormat? I don't need speed here (it's a one-off, one-frame operation).
On Nov 7, 2008, Tim Monroe of QuickTime Engineering responded:
Currently there is no way to do what you want via frameImageAtTime. I would suggest filing an enhancement request.
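To show what I mean by a lower-level API: QuickTime's visual-context calls (as opposed to QTKit) let you attach a pixel-buffer destination that requests a specific pixel format. An untested sketch of mine, based on the QTPixelBufferContextCreate documentation; whether the decoder actually honors the 'y420' request is exactly what I still need to verify:

    // Sketch: replace the movie's destination with a pixel-buffer visual
    // context that asks for planar YUV 4:2:0 (kYUV420PixelFormat = 'y420').
    NSDictionary *pbAttrs = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithInt:kYUV420PixelFormat]
        forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
    NSDictionary *ctxAttrs = [NSDictionary dictionaryWithObject:pbAttrs
        forKey:(NSString *)kQTVisualContextPixelBufferAttributesKey];

    QTVisualContextRef visualContext = NULL;
    OSStatus err = QTPixelBufferContextCreate(kCFAllocatorDefault,
                                              (CFDictionaryRef)ctxAttrs,
                                              &visualContext);
    // Needs the underlying Movie; [qtMovie quickTimeMovie] exposes it in QTKit
    // (32-bit only, since the QuickTime C API never made it to 64-bit).
    err = SetMovieVisualContext([qtMovie quickTimeMovie], visualContext);

    // Then, on each idle/display pass:
    QTVisualContextTask(visualContext);
    if (QTVisualContextIsNewImageAvailable(visualContext, NULL)) {
        CVImageBufferRef frame = NULL;
        QTVisualContextCopyImageForTime(visualContext, kCFAllocatorDefault,
                                        NULL, &frame);
        // frame should now be 'y420', if the request was honored.
        CVBufferRelease(frame);
    }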