
Play multiple Audio Files with AVPlayer


I'm trying to play multiple sounds at the same time.

The approach I initially took was to create several players, but it seems to be the wrong one.

What's the best way to play several audio files at the same time?

Is it by making them AVAssets? But in that case, how would I stop and play them whenever I want?

Really appreciate your help.

The reason I need AVPlayer is to fetch sounds from the iPod library.

I finally got an answer from the technical support of the Apple Dev Team, and it seems I was on the right track when I decided to use several AVPlayers.


For every sound you want to play, make a new AVPlayer.

NSURL *url = [NSURL URLWithString:pathToYourFile];
AVPlayer *audioPlayer = [[AVPlayer alloc] initWithURL:url]; // one dedicated player per sound
[audioPlayer play];
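To also stop and restart individual sounds whenever you want, keep each player in a collection and call play/pause on the one you need. A minimal Swift sketch of the same idea (the helper functions and sound names here are illustrative, not from the answer above):

import AVFoundation

// One AVPlayer per sound, keyed by a name of your choosing (names are hypothetical).
var players: [String: AVPlayer] = [:]

func loadSound(named name: String, url: URL) {
    players[name] = AVPlayer(url: url)
}

func playSound(named name: String) {
    players[name]?.seek(to: .zero)  // rewind so the sound starts from the beginning
    players[name]?.play()
}

func stopSound(named name: String) {
    players[name]?.pause()          // AVPlayer has no stop; pause (plus seek) covers it
}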


I have never answered a question here, and I don't know if anyone is still waiting for an answer to this, but here's my take... Try this and it should work; I am currently using it to play 12+ simultaneous audio samples. I apologize if I am doing something newbish.

You press a button and run this code...

But first you need to:

  1. Import AVFoundation into the project and #import it in the .h file as well; then we can play sound with it.
  2. Put "AVAudioPlayer *myAudio;" (without the quotation marks, of course) somewhere at the top (usually above viewDidLoad).

Then just...

-(IBAction)playButtonPressed:(id)sender {

    NSURL *yourMusicFile;
    yourMusicFile = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"your_Song_Name" ofType:@"mp3"]];

    myAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:yourMusicFile error:nil];
    [myAudio play];
    NSLog(@"Button -playButtonPressed- has been pressed!");
}


Well, my solution comes out of experience. I can quickly cook up a project if needed, but it requires the MoMu toolkit from Stanford. It involves creating WvIn and WvOut objects for reading the files; the audio samples of these objects simply need to be fed to the output buffer to play the files simultaneously. Although the toolkit uses AVFoundation internally, there is no explicit use of AVFoundation in this project.


Basically, as everyone else is saying: make sure you create an audio player for each source.

What you also must do is KEEP A STRONG REFERENCE TO ALL THE PLAYER OBJECTS.

If you don't do this they get released and playback stops.

I had this issue when I was only keeping a reference to the last source I wanted to play back. That meant I would only hear the last data source, and I thought the issue was something to do with the simultaneous playback configuration, but in reality the other players were simply being dealloc'd, and thus their playback would stop.

try? AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.playback, options: .mixWithOthers)
try? AVAudioSession.sharedInstance().setActive(true)

I also used the above snippet before creating and playing back my audio sources.
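Putting both pieces together, a minimal sketch under those assumptions (the playAll helper and its URL list are placeholders of mine, not part of the original answer): the session is configured for mixed playback, and every AVAudioPlayer is stored in a property so it stays alive for the duration of playback.

import AVFoundation

// Strong references: if these players are released, their playback stops.
var players: [AVAudioPlayer] = []

func playAll(_ urls: [URL]) {
    // Allow this app's audio to mix with other audio instead of interrupting it.
    try? AVAudioSession.sharedInstance().setCategory(.playback, options: .mixWithOthers)
    try? AVAudioSession.sharedInstance().setActive(true)

    for url in urls {
        if let player = try? AVAudioPlayer(contentsOf: url) {
            players.append(player)  // keep the strong reference before playing
            player.play()
        }
    }
}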


We can build a mixed composition with AVMutableComposition from multiple audio assets, then play it on an AVPlayer, like so:

var player: AVPlayer?

func addMultipleTrack(videoURL1: URL, videoURL2: URL, videoURL3: URL) {

    let composition = AVMutableComposition()

    // Insert each asset's first audio track into its own composition track,
    // all starting at time zero, so the tracks play back simultaneously.
    for url in [videoURL1, videoURL2, videoURL3] {
        let asset = AVURLAsset(url: url)
        let timeRange = CMTimeRange(start: .zero, duration: asset.duration)

        if let audioAssetTrack = asset.tracks(withMediaType: .audio).first,
           let compositionAudioTrack = composition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid) {
            do {
                try compositionAudioTrack.insertTimeRange(timeRange, of: audioAssetTrack, at: .zero)
            } catch {
                print(error)
                return
            }
        }
    }

    let item = AVPlayerItem(asset: composition)

    player = AVPlayer(playerItem: item)

    let layer = AVPlayerLayer(player: player!)

    tempView.layer.addSublayer(layer) // The layer just hosts playback and never needs to be visible, so you can hide it however you like.
    player?.play()

}
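A hypothetical call site, assuming three bundled mp3 files (the file names are placeholders, and the URLs are force-unwrapped for brevity):

let url1 = Bundle.main.url(forResource: "track1", withExtension: "mp3")!
let url2 = Bundle.main.url(forResource: "track2", withExtension: "mp3")!
let url3 = Bundle.main.url(forResource: "track3", withExtension: "mp3")!
addMultipleTrack(videoURL1: url1, videoURL2: url2, videoURL3: url3)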