I'm trying to save the sample buffer instead of a UIImage to an array, to convert it later on. This is to speed up image capturing and perhaps avoid memory warnings. I just can't figure out how to save it to the array and then use it again to call [self imageFromSampleBuffer:sampleBuffer]. I tried something like this, but how do I convert the data back to a CMSampleBufferRef object?
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection {
// Create a UIImage from the sample buffer data
// UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
// [arrCaptures addObject:image];
    [arrImageBuffer addObject:[NSData dataWithBytes:sampleBuffer length:sizeof(sampleBuffer)]];
}
Why not just use a CFArray and directly put the CMSampleBufferRef objects in there?
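A minimal sketch of that idea, assuming a CFMutableArrayRef ivar named sampleBuffers (the name is made up for illustration). Using kCFTypeArrayCallBacks makes the array CFRetain each buffer when it is appended and CFRelease it when removed:

// Somewhere during setup, e.g. in init:
// kCFTypeArrayCallBacks retains/releases elements automatically.
sampleBuffers = CFArrayCreateMutable(kCFAllocatorDefault, 0, &kCFTypeArrayCallBacks);

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // The array retains the buffer; no NSData round-trip needed.
    CFArrayAppendValue(sampleBuffers, sampleBuffer);
}

// Later, read a buffer back and convert it:
CMSampleBufferRef buf = (CMSampleBufferRef)CFArrayGetValueAtIndex(sampleBuffers, 0);
UIImage *image = [self imageFromSampleBuffer:buf];

Note that this keeps the delegate's pool buffers alive, which runs into the problem described in the answer below.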
You can use a CFArray, but remember that the CMSampleBufferRef is not retained for you, and that captureOutput:didOutputSampleBuffer:fromConnection: draws its buffers from a fixed memory pool. If you hold on to them without releasing, the session stops delivering new samples (which is why you only get 13 samples), as documented for captureOutput:didOutputSampleBuffer:fromConnection:
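One way around the pool limit is to copy each sample buffer before storing it, so the delegate's buffer can be recycled right away. A hedged sketch, again assuming a sampleBuffers array created with kCFTypeArrayCallBacks; note that CMSampleBufferCreateCopy makes a shallow copy that still references the same underlying media data, so if pool exhaustion persists you may need to convert (or deep-copy the pixel buffer) immediately instead:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CMSampleBufferRef copy = NULL;
    // Copy the sample buffer so we don't hold on to the pool's buffer itself.
    if (CMSampleBufferCreateCopy(kCFAllocatorDefault, sampleBuffer, &copy) == noErr) {
        CFArrayAppendValue(sampleBuffers, copy); // array retains the copy
        CFRelease(copy);                         // balance CreateCopy's +1
    }
}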