I have recorded a video with a custom view placed over the camera using the cameraOverlayView option. While recording, my view is displayed, but when I save the video and play it back, the view doesn't appear.
Can anyone help me fix this issue?
Thanks in advance.
I'm afraid it won't be as easy as that. You'll have to capture the individual frames yourself using the AVCaptureSession class. Then you can composite your overlaid view onto the images as you capture them, and feed the composited frames to an AVAssetWriter to produce the final movie file.
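To give you a rough idea of that last step, here is a minimal sketch of the writing side. It is only an outline under my own assumptions: the outputURL parameter, the 480x360 dimensions, and the writer/writerInput/adaptor ivar names are placeholders, not part of any existing code, so adapt them to your project.

// Rough sketch: an AVAssetWriter that will accept the composited frames.
// writer, writerInput and adaptor are assumed to be ivars of this class.
- (void)setupAssetWriterWithURL:(NSURL *)outputURL
{
    NSError *error = nil;
    writer = [[AVAssetWriter alloc] initWithURL:outputURL
                                       fileType:AVFileTypeQuickTimeMovie
                                          error:&error];
    if (!writer) {
        NSLog(@"Couldn't create asset writer: %@", error);
        return;
    }

    // Placeholder dimensions -- match them to the capture preset you chose
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        AVVideoCodecH264,             AVVideoCodecKey,
        [NSNumber numberWithInt:480], AVVideoWidthKey,
        [NSNumber numberWithInt:360], AVVideoHeightKey,
        nil];
    writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                                 outputSettings:videoSettings];
    writerInput.expectsMediaDataInRealTime = YES;

    // The adaptor lets us append raw CVPixelBuffers with explicit timestamps
    adaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                  initWithAssetWriterInput:writerInput
               sourcePixelBufferAttributes:nil];

    [writer addInput:writerInput];
    [writer startWriting]; // the timeline is started later, at the first frame
}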
It's pretty involved. Here is some code for setting up the capture to get you started:
// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session (stored in an ivar; release it in dealloc under MRC)
    session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower-resolution video frames, if your
    // processing algorithm can cope. Adjust this preset to suit your needs.
    session.sessionPreset = AVCaptureSessionPresetLow;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        // Handle the error appropriately
        NSLog(@"Couldn't create video capture device input: %@", error);
        return;
    }
    [session addInput:input];

    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    if (!output) {
        // Handle the error appropriately
        NSLog(@"Couldn't create video data output");
        return;
    }
    // Discarding late frames helps keep the app responsive under load
    output.alwaysDiscardsLateVideoFrames = YES;
    [session addOutput:output];

    // Configure the output: deliver sample buffers on a dedicated serial queue
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Specify the pixel format; kCVPixelFormatType_32BGRA is the most
    // convenient choice for compositing with Core Graphics
    output.videoSettings =
        [NSDictionary dictionaryWithObject:
            [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    // If you wish to cap the frame rate to a known value, set minFrameDuration.
    // VIDEO_CAPTURE_FRAMERATE is a project-defined constant, e.g. 15 for 15 fps.
    output.minFrameDuration = CMTimeMake(1, VIDEO_CAPTURE_FRAMERATE);

    // Observe session state so you can react to errors and interruptions
    NSNotificationCenter *notify = [NSNotificationCenter defaultCenter];
    [notify addObserver:self selector:@selector(onVideoError:) name:AVCaptureSessionRuntimeErrorNotification object:session];
    [notify addObserver:self selector:@selector(onVideoInterrupted:) name:AVCaptureSessionWasInterruptedNotification object:session];
    [notify addObserver:self selector:@selector(onVideoEnded:) name:AVCaptureSessionInterruptionEndedNotification object:session];
    [notify addObserver:self selector:@selector(onVideoDidStopRunning:) name:AVCaptureSessionDidStopRunningNotification object:session];
    [notify addObserver:self selector:@selector(onVideoStart:) name:AVCaptureSessionDidStartRunningNotification object:session];

    // Start the flow of data from the inputs to the outputs
    [session startRunning];
}
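Once the session is running, every captured frame arrives in the sample buffer delegate. Below is a hedged sketch of the compositing step, under a few assumptions of mine that aren't in the code above: overlayImage, writer, writerInput, adaptor and sessionStarted are ivars, where overlayImage is a CGImageRef snapshot of your overlay view (rendered once, e.g. with the view layer's renderInContext:) and the writer objects come from a setup like the sketch earlier.

// Called on the queue set above for every captured frame: draw the overlay
// into the frame's BGRA pixel buffer, then append the result to the writer.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Wrap the BGRA pixel data in a bitmap context so we can draw on top of it
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                                                 CVPixelBufferGetWidth(pixelBuffer),
                                                 CVPixelBufferGetHeight(pixelBuffer),
                                                 8,
                                                 CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Core Graphics uses a bottom-left origin, so you may need to flip the
    // context (CGContextTranslateCTM/CGContextScaleCTM) to match UIKit.
    CGContextDrawImage(context,
                       CGRectMake(0, 0,
                                  CVPixelBufferGetWidth(pixelBuffer),
                                  CVPixelBufferGetHeight(pixelBuffer)),
                       overlayImage);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    // Append the composited frame, starting the writer's timeline on the first one
    CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (!sessionStarted) {
        [writer startSessionAtSourceTime:timestamp];
        sessionStarted = YES;
    }
    if (writerInput.readyForMoreMediaData) {
        [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:timestamp];
    }
}

When you stop recording, stop the session, then call markAsFinished on the writer input and finishWriting on the writer to close the movie file.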