Trim audio with iOS


I want to implement a feature that lets the user trim an audio file (.caf) which he previously recorded. The recording part already works, but how can I add a trimming feature similar to the one in the Voice Memos app? Is there an API for the audio trimmer Apple uses? Any help would be great...


How about using AVFoundation? Import the audio file into an AVAsset (or a composition, etc.), then export it to a file, setting the preferred start time and duration.

I wrote a stock function a while ago that exports an asset to a file; you can also pass in an AVAudioMix. As written below it exports the whole file, but you could assign a CMTimeRange to exporter.timeRange and there you go (see the sketch after the function). I haven't tested that, but it should work. Another alternative could be to adjust the time ranges when creating the AVAsset and its tracks. Of course this exporter only handles m4a (AAC). Sorry if this wasn't what you wanted.

-(void)exportAsset:(AVAsset*)asset toFile:(NSString*)filename overwrite:(BOOL)overwrite withMix:(AVAudioMix*)mix {
//NSArray* availablePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:asset];

AVAssetExportSession* exporter = [AVAssetExportSession exportSessionWithAsset:asset presetName:AVAssetExportPresetAppleM4A];

if (exporter == nil) {
    DLog(@"Failed creating exporter!");
    return;
}

DLog(@"Created exporter! %@", exporter);

// Set output file type
DLog(@"Supported file types: %@", exporter.supportedFileTypes);
for (NSString* filetype in exporter.supportedFileTypes) {
    if ([filetype isEqualToString:AVFileTypeAppleM4A]) {
        exporter.outputFileType = AVFileTypeAppleM4A;
        break;
    }
}
if (exporter.outputFileType == nil) {
    DLog(@"Needed output file type not found? (%@)", AVFileTypeAppleM4A);
    return;
}

// Set outputURL
NSArray* paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString* parentDir = [NSString stringWithFormat:@"%@/", [paths objectAtIndex:0]];

NSString* outPath = [NSString stringWithFormat:@"%@%@", parentDir, filename];

NSFileManager* manager = [NSFileManager defaultManager];
if ([manager fileExistsAtPath:outPath]) {
    DLog(@"%@ already exists!", outPath);
    if (!overwrite) {
        DLog(@"Not overwriting, uh oh!");
        return;
    }
    else {
        // Overwrite
        DLog(@"Overwrite! (delete first)");
        NSError* error = nil;
        if (![manager removeItemAtPath:outPath error:&error]) {
            DLog(@"Failed removing %@, error: %@", outPath, error.description);
            return;
        }
        else {
            DLog(@"Removed %@", outPath);
        }
    }
}

NSURL* const outUrl = [NSURL fileURLWithPath:outPath];
exporter.outputURL = outUrl;
// Specify a time range in case only part of file should be exported
//exporter.timeRange =

if (mix != nil)
    exporter.audioMix = mix; // important

DLog(@"Starting export! (%@)", exporter.outputURL);
[exporter exportAsynchronouslyWithCompletionHandler:^(void) {
    // Export ended for some reason. Check in status
    NSString* message;
    switch (exporter.status) {
        case AVAssetExportSessionStatusFailed:
            message = [NSString stringWithFormat:@"Export failed. Error: %@", exporter.error.description];
            DLog(@"%@", message);
            [self showAlert:message];
            break;
        case AVAssetExportSessionStatusCompleted: {
            /*if (playfileWhenExportFinished) {
             DLog(@"playfileWhenExportFinished!");
             [self playfileAfterExport:exporter.outputURL];
             playfileWhenExportFinished = NO;
             }*/
            message = [NSString stringWithFormat:@"Export completed: %@", filename];
            DLog(@"%@", message);
            [self showAlert:message];
            break;
        }
        case AVAssetExportSessionStatusCancelled:
            message = [NSString stringWithFormat:@"Export cancelled!"];
            DLog(@"%@", message);
            [self showAlert:message];
            break;
        default:
            DLog(@"Export unhandled status: %d", exporter.status);
            break;
    }       
}];
}
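To trim instead of exporting everything, the time range could be set on the exporter right before calling exportAsynchronouslyWithCompletionHandler:. Here is a minimal sketch, assuming trimStartSeconds and trimDurationSeconds are hypothetical values the user picked in your trim UI (untested, as noted above):

// Hypothetical trim values chosen by the user in a trim UI
Float64 trimStartSeconds = 2.0;
Float64 trimDurationSeconds = 5.0;

// Build the range with a reasonably fine timescale (600 is a common choice)
CMTime trimStart = CMTimeMakeWithSeconds(trimStartSeconds, 600);
CMTime trimDuration = CMTimeMakeWithSeconds(trimDurationSeconds, 600);

// Only this portion of the asset will be written to the output file
exporter.timeRange = CMTimeRangeMake(trimStart, trimDuration);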


The answer above from @Jonny is correct. Here I'm adding the use of an AVAudioMix to apply a fade-in effect while trimming the audio.

Output: the audio asset is trimmed to 20 seconds with a 10-second fade-in. The trim set up in the code snippet starts at the 30-second mark of the asset, so the track duration should be at least 50 seconds.

- (BOOL)exportAssettoFilePath:(NSString *)filePath {


NSString *inputFilePath = <inputFilePath>;

NSURL *videoToTrimURL = [NSURL fileURLWithPath:inputFilePath];
AVAsset *avAsset = [AVAsset assetWithURL:videoToTrimURL];

// we need the audio asset to be at least 50 seconds long for this snippet
CMTime assetTime = [avAsset duration];
Float64 duration = CMTimeGetSeconds(assetTime);
if (duration < 50.0) return NO;

// get the first audio track
NSArray *tracks = [avAsset tracksWithMediaType:AVMediaTypeAudio];
if ([tracks count] == 0) return NO;

AVAssetTrack *track = [tracks objectAtIndex:0];

// create the export session
// no need for a retain here, the session will be retained by the
// completion handler since it is referenced there
AVAssetExportSession *exportSession = [AVAssetExportSession
                                       exportSessionWithAsset:avAsset
                                       presetName:AVAssetExportPresetAppleM4A];
if (nil == exportSession) return NO;

// create trim time range - 20 seconds starting from 30 seconds into the asset
CMTime startTime = CMTimeMake(30, 1);
CMTime stopTime = CMTimeMake(50, 1);
CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, stopTime);

// create fade in time range - 10 seconds starting at the beginning of trimmed asset
CMTime startFadeInTime = startTime;
CMTime endFadeInTime = CMTimeMake(40, 1);
CMTimeRange fadeInTimeRange = CMTimeRangeFromTimeToTime(startFadeInTime,
                                                        endFadeInTime);

// setup audio mix
AVMutableAudioMix *exportAudioMix = [AVMutableAudioMix audioMix];
AVMutableAudioMixInputParameters *exportAudioMixInputParameters =
[AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];

[exportAudioMixInputParameters setVolumeRampFromStartVolume:0.0 toEndVolume:1.0
                                                  timeRange:fadeInTimeRange];
exportAudioMix.inputParameters = [NSArray
                                  arrayWithObject:exportAudioMixInputParameters];

// configure export session  output with all our parameters
exportSession.outputURL = [NSURL fileURLWithPath:filePath]; // output path
exportSession.outputFileType = AVFileTypeAppleM4A; // output file type
exportSession.timeRange = exportTimeRange; // trim time range
exportSession.audioMix = exportAudioMix; // fade in audio mix

// perform the export
[exportSession exportAsynchronouslyWithCompletionHandler:^{

    if (AVAssetExportSessionStatusCompleted == exportSession.status) {
        NSLog(@"AVAssetExportSessionStatusCompleted");
    } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
        // a failure may happen because of an event out of your control
        // for example, an interruption like a phone call coming in
        // make sure and handle this case appropriately
        NSLog(@"AVAssetExportSessionStatusFailed");
    } else {
        NSLog(@"Export Session Status: %ld", (long)exportSession.status);
    }
}];

return YES;
}
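A usage sketch for the method above, assuming it lives on the same class and the trimmed file should go into the Documents directory (the trimmed.m4a file name is just an example):

// Build an output path in the Documents directory (file name chosen for illustration)
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *outputPath = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"trimmed.m4a"];

// Kick off the trim + fade-in export; the completion handler above logs the result
if (![self exportAssettoFilePath:outputPath]) {
    NSLog(@"Could not start the export (asset too short or no audio track?)");
}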

Thanks.

For more details:

https://developer.apple.com/library/ios/qa/qa1730/_index.html
