AudioRecord and AudioTrack latency

https://www.devze.com 2023-02-19 04:50 Source: web
I'm trying to develop an application like iRig for Android, so the first step is to capture the mic input and play it back at the same time.

I have it working, but the problem is that I get some latency that makes this unusable, and if I start processing the buffer I'm afraid it will get totally unusable.

I use AudioRecord and AudioTrack like this:

    new Thread(new Runnable() {
        public void run() {
            while (mRunning) {
                mRecorder.read(mBuffer, 0, mBufferSize);
                // TODO: apply filters to the buffer here, then play it modified
                mPlayer.write(mBuffer, 0, mBufferSize);
                //Log.v("MY AMP","ARA");
            }
        }
    }).start();

And the initialization this way:

// ==================== INITIALIZE ========================= //
public void initialize(){

    mBufferSize = AudioRecord.getMinBufferSize(mHz, 
                AudioFormat.CHANNEL_CONFIGURATION_MONO, 
                AudioFormat.ENCODING_PCM_16BIT);

    mBufferSize2 = AudioTrack.getMinBufferSize(mHz, 
                AudioFormat.CHANNEL_CONFIGURATION_MONO, 
                AudioFormat.ENCODING_PCM_16BIT);

    mBuffer = new byte[mBufferSize];

    Log.v("MY AMP","Buffer size:" + mBufferSize);

    mRecorder = new AudioRecord(MediaRecorder.AudioSource.MIC, 
                mHz,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT, 
                mBufferSize);

    mPlayer = new AudioTrack(AudioManager.STREAM_MUSIC,
                mHz,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                mBufferSize2, 
                AudioTrack.MODE_STREAM);    

}

Do you know how to get a faster response? Thanks!


Android's AudioTrack/AudioRecord classes have high latency due to their minimum buffer sizes. According to Google, those buffer sizes exist to minimize dropouts when GCs occur (which is a wrong decision in my opinion; you can optimize your own memory management).

What you want to do is use OpenSL ES, which is available from Android 2.3. It provides native APIs for streaming audio. Here are some docs: http://mobilepearls.com/labs/native-android-api/opensles/index.html


Just a thought, but shouldn't you be reading fewer than mBufferSize bytes per call?


As mSparks pointed out, streaming should be done using a smaller read size: you don't need to read the full buffer to stream data!

    int read = mRecorder.read(mBuffer, 0, 256); /* Or any other magic number */
    if (read > 0) {
        mPlayer.write(mBuffer, 0, read);
    }

This will drastically reduce your latency. If mHz is 44100 and you are in mono 16-bit configuration, note that read() counts bytes: 256 bytes is 128 samples, so each chunk adds no less than 1000 * 128 / 44100 ≈ 2.9 ms of latency (dividing the sample count by the sample rate converts to seconds; multiplying by 1000 gives milliseconds). The remaining problem is the internal implementation of the player, which you have no control over from Java. Hope this helps someone :)
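The byte-to-sample-to-millisecond conversion above can be sketched as a small helper (plain Java, not Android-specific; the method name and parameters are my own, for illustration only):

```java
public class ChunkLatency {
    /**
     * Latency in ms contributed by one read chunk: bytes -> frames -> ms.
     * For 16-bit PCM each sample is 2 bytes, so a mono 256-byte chunk
     * holds 128 samples.
     */
    static double chunkLatencyMs(int chunkBytes, int sampleRate,
                                 int channels, int bytesPerSample) {
        int frames = chunkBytes / (channels * bytesPerSample);
        return 1000.0 * frames / sampleRate;
    }

    public static void main(String[] args) {
        // A 256-byte mono 16-bit chunk at 44100 Hz: ~2.9 ms per chunk.
        System.out.printf("%.1f ms%n", chunkLatencyMs(256, 44100, 1, 2));
    }
}
```

This only accounts for the chunk size itself; the device's internal output path adds latency on top of it.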


My first instinct was to suggest initializing AudioTrack in static mode rather than streaming mode, since static mode has notably smaller latency. However, static mode is more appropriate for short sounds that fit entirely in memory than for sound you are capturing from elsewhere. But just as a wild guess: what if you set AudioTrack to static mode and feed it discrete chunks of your input audio?
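A hedged sketch of the "discrete chunks" idea: splitting a captured buffer into fixed-size pieces before handing each to the player (plain Java; the class and helper names are mine, and the actual AudioTrack feeding is omitted):

```java
import java.util.ArrayList;
import java.util.List;

public class Chunker {
    /** Split a captured byte buffer into fixed-size chunks (last may be shorter). */
    static List<byte[]> chunks(byte[] buffer, int validBytes, int chunkSize) {
        List<byte[]> out = new ArrayList<>();
        for (int off = 0; off < validBytes; off += chunkSize) {
            int len = Math.min(chunkSize, validBytes - off);
            byte[] c = new byte[len];
            System.arraycopy(buffer, off, c, 0, len);
            out.add(c);
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] captured = new byte[1000];          // e.g. one read() worth of audio
        List<byte[]> parts = chunks(captured, 1000, 256);
        // 1000 bytes -> chunks of 256, 256, 256, 232
        System.out.println(parts.size());
        // Each chunk could then be written to the player in turn.
    }
}
```

Whether re-feeding a static-mode AudioTrack chunk by chunk beats streaming mode in practice is untested here; this only shows the chunking step.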

If you want tighter control over audio, I'd recommend taking a look at OpenSL ES for Android. The learning curve is a bit steeper, but you get much more fine-grained control and lower latency.

