I want to export a movie with AVAssetWriter
and can't figure out how to include the video and audio tracks in sync. Exporting only video works fine, but when I add audio the resulting movie looks like this:
First I see the video (without audio), then the video freezes (showing the last frame until the end), and after a few seconds I hear the audio.
I tried adjusting the timestamps with CMSampleBufferSetOutputPresentationTimeStamp
(subtracting the first CMSampleBufferGetPresentationTimeStamp
from the current one) for the audio, but none of it worked, and I don't think it is the right direction anyway, since video & audio in the source movie should already be in sync...
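For reference, the retiming attempt looked roughly like this (a sketch; the static bookkeeping variables are just illustration, not my real code):
#import <CoreMedia/CoreMedia.h>

// shift each audio buffer so that the first one lands at time zero
static CMTime firstAudioPTS;
static BOOL haveFirstAudioPTS=NO;

static void RetimeAudioBuffer(CMSampleBufferRef sampleBuffer)
{
    CMTime pts=CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if(!haveFirstAudioPTS)
    {
        firstAudioPTS=pts;
        haveFirstAudioPTS=YES;
    }
    CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, CMTimeSubtract(pts, firstAudioPTS));
}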
My setup in short: I create an AVAssetReader
and two AVAssetReaderTrackOutput instances
(one for video, one for audio) and add them to the AVAssetReader;
then I create an AVAssetWriter
and two AVAssetWriterInput instances
(video & audio) and add them to the AVAssetWriter.
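For reference, the setup looks roughly like this (a sketch; sourceURL, outputURL and the nil pass-through output settings are assumptions, not my exact code):
#import <AVFoundation/AVFoundation.h>

NSError *error=nil;
AVURLAsset *sourceAsset=[AVURLAsset URLAssetWithURL:sourceURL options:nil]; // the movie to re-export

assetReader=[[AVAssetReader alloc] initWithAsset:sourceAsset error:&error];
AVAssetTrack *videoTrack=[[sourceAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *audioTrack=[[sourceAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
// nil output settings = pass the compressed samples through unchanged
assetReaderVideoOutput=[AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:nil];
assetReaderAudioOutput=[AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];
[assetReader addOutput:assetReaderVideoOutput];
[assetReader addOutput:assetReaderAudioOutput];

assetWriter=[[AVAssetWriter alloc] initWithURL:outputURL fileType:AVFileTypeMPEG4 error:&error];
// nil output settings = write the samples exactly as they were read
assetWriterVideoInput=[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:nil];
assetWriterAudioInput=[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:nil];
[assetWriter addInput:assetWriterVideoInput];
[assetWriter addInput:assetWriterAudioInput];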
I start it all up with:
[assetReader startReading];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
Then I run two queues to handle the sample buffers:
dispatch_queue_t queueVideo=dispatch_queue_create("assetVideoWriterQueue", NULL);
[assetWriterVideoInput requestMediaDataWhenReadyOnQueue:queueVideo usingBlock:^
{
    while([assetWriterVideoInput isReadyForMoreMediaData])
    {
        CMSampleBufferRef sampleBuffer=[assetReaderVideoOutput copyNextSampleBuffer];
        if(sampleBuffer)
        {
            [assetWriterVideoInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        }
        else
        {
            // no more video samples: finish this input
            [assetWriterVideoInput markAsFinished];
            dispatch_release(queueVideo);
            videoFinished=YES;
            break;
        }
    }
}];
dispatch_queue_t queueAudio=dispatch_queue_create("assetAudioWriterQueue", NULL);
[assetWriterAudioInput requestMediaDataWhenReadyOnQueue:queueAudio usingBlock:^
{
    while([assetWriterAudioInput isReadyForMoreMediaData])
    {
        CMSampleBufferRef sampleBuffer=[assetReaderAudioOutput copyNextSampleBuffer];
        if(sampleBuffer)
        {
            [assetWriterAudioInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        }
        else
        {
            // no more audio samples: finish this input
            [assetWriterAudioInput markAsFinished];
            dispatch_release(queueAudio);
            audioFinished=YES;
            break;
        }
    }
}];
In the main loop I wait for both queues to finish:
while(!videoFinished || !audioFinished) // || so we keep waiting until *both* tracks are done
{
    sleep(1);
}
[assetWriter finishWriting];
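It may also be worth inspecting the writer after finishing, to see whether the file was actually written out (a sketch):
// check the writer before handing the file to the assets library
if(assetWriter.status==AVAssetWriterStatusFailed)
    NSLog(@"asset writer failed: %@", assetWriter.error);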
Furthermore, I try to save the resulting file to the library with the following code...
NSURL *url=[[NSURL alloc] initFileURLWithPath:path];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if([library videoAtPathIsCompatibleWithSavedPhotosAlbum:url])
{
    [library writeVideoAtPathToSavedPhotosAlbum:url completionBlock:^(NSURL *assetURL, NSError *error)
    {
        if(error)
            NSLog(@"error=%@",error.localizedDescription);
        else
            NSLog(@"completed...");
    }];
}
else
    NSLog(@"error, video not saved...");
[library release];
[url release];
...but I get the error:
Video /Users/cb/Library/Application Support/iPhone Simulator/4.2/Applications/E9865BF9-D190-4912-9248-66768B1AB635/Documents/export.mp4 cannot be saved to the saved photos album: Error Domain=NSOSStatusErrorDomain Code=-12950 "Movie could not be played." UserInfo=0x5e4fb90 {NSLocalizedDescription=Movie could not be played.}
The saving code works without problems in another program, so something must be wrong with the movie itself...?
You can use this code to merge audio and video:
-(void)mergeAudioVideo
{
    NSString *videoOutputPath=[_documentsDirectory stringByAppendingPathComponent:@"dummy_video.mp4"];
    NSString *outputFilePath = [_documentsDirectory stringByAppendingPathComponent:@"final_video.mp4"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
    NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
    NSString *filePath = [_documentsDirectory stringByAppendingPathComponent:@"newFile.m4a"];

    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    NSURL *audio_inputFileUrl = [NSURL fileURLWithPath:filePath];
    NSURL *video_inputFileUrl = [NSURL fileURLWithPath:videoOutputPath];
    CMTime nextClipStartTime = kCMTimeZero;

    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:video_inputFileUrl options:nil];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audio_inputFileUrl options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie; // same as @"com.apple.quicktime-movie"; note the .mp4 extension above really belongs to MPEG-4 output
    _assetExport.outputURL = outputFileUrl;
    [_assetExport exportAsynchronouslyWithCompletionHandler:^{
        if (_assetExport.status == AVAssetExportSessionStatusCompleted) {
            // write code here to continue
        }
        else {
            // write failure-handling code here
        }
    }];
}
It seems that assetWriterAudioInput ignores the sample buffer timestamps when writing audio. Do it this way:
1) Write the video track.
2) When done, mark it as finished, i.e. [videoWriterInput markAsFinished];
3) Do [assetWriter startSessionAtSourceTime:timeRangeStart];
4) Instantiate the audio reader and start writing the audio.
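A minimal sketch of that sequential approach, reusing the variable names from the question (note that in AVFoundation the writing session itself has to be started once, before any samples are appended):
[assetReader startReading];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];

// 1) drain the video track completely on its own queue
dispatch_queue_t queueVideo=dispatch_queue_create("assetVideoWriterQueue", NULL);
[assetWriterVideoInput requestMediaDataWhenReadyOnQueue:queueVideo usingBlock:^
{
    while([assetWriterVideoInput isReadyForMoreMediaData])
    {
        CMSampleBufferRef sampleBuffer=[assetReaderVideoOutput copyNextSampleBuffer];
        if(!sampleBuffer)
        {
            // 2) video done: mark the input as finished...
            [assetWriterVideoInput markAsFinished];
            // 3) ...then start feeding the audio on its own queue
            dispatch_queue_t queueAudio=dispatch_queue_create("assetAudioWriterQueue", NULL);
            [assetWriterAudioInput requestMediaDataWhenReadyOnQueue:queueAudio usingBlock:^
            {
                while([assetWriterAudioInput isReadyForMoreMediaData])
                {
                    CMSampleBufferRef audioBuffer=[assetReaderAudioOutput copyNextSampleBuffer];
                    if(!audioBuffer)
                    {
                        [assetWriterAudioInput markAsFinished];
                        [assetWriter finishWriting];
                        break;
                    }
                    [assetWriterAudioInput appendSampleBuffer:audioBuffer];
                    CFRelease(audioBuffer);
                }
            }];
            break;
        }
        [assetWriterVideoInput appendSampleBuffer:sampleBuffer];
        CFRelease(sampleBuffer);
    }
}];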