
iOS Core Audio: modifying aurioTouch to save PCM data


As an exercise, I'm trying to modify aurioTouch so that it saves the first 60 seconds of PCM that come in through the mic. I'm avoiding higher-level libs because I want to build some low-latency real-time processing on top of this. I did it by creating a big saveBuffer and appending the same data_ptr[2] value that gets stored in drawBuffers[], for each of the inNumberFrames per call to PerformThru(). Then, after 60 seconds have elapsed, I dump the buffer to disk in one shot.

I tried this code by feeding in a uniform click. The problem is that when I visualize the saveBuffer data in gnuplot, I get peaks at non-uniform times, off by 30-40% from the steady click: some peaks are close together while others are far apart. I can see that the input click .wav is very even, but the saveBuffer plot has bizarre peaks. It makes me wonder whether I am saving the PCM data correctly. Perhaps I'm somehow taking too long in the callback and losing data as a result?

The changes I made in PerformThru():

{
    // allocate buffer
    static int *saveBuffer = ( int * ) malloc( 10000000 * sizeof( int ) );
    . . .

    SInt8 *data_ptr = (SInt8 *)(ioData->mBuffers[0].mData);

    for (i=0; i<inNumberFrames; i++)
    {
        if ((i+drawBufferIdx) >= drawBufferLen)
        {
            cycleOscilloscopeLines();
            drawBufferIdx = -i;
        }
        drawBuffers[0][i + drawBufferIdx] = data_ptr[2];

        // XXXX I added this line
        if ( saveBuffer ) { saveBuffer[ saveBufferIdx++ ] = ( data_ptr[ 2 ] ); }


        data_ptr += 4;
    }

    // XXX - I added this block:  dump saveBuffer after 60 seconds
    if ( saveBuffer && ( CAHostTimeBase::HostDeltaToNanos( initialHostTime, inTimeStamp->mHostTime ) / 1000000000 ) > 60 )
    {
        std::ofstream bufferOut;
        bufferOut.open( "pcmBuffer.txt" );
        for ( UInt64 i = 0; i < saveBufferIdx; i++ )
        {
            bufferOut << saveBuffer[ i ] << std::endl;
        }
        bufferOut.close();
        free( saveBuffer );
        saveBuffer = 0;
    }
    drawBufferIdx += inNumberFrames;
}


Use CACurrentMediaTime() (media current / current media -- I always forget which way round).

Gosh, this source code is butt-ugly; whoever wrote it at Apple has a lot to learn about writing readable code. The fact that it is actually used as a public sample is just a joke.

It looks like you are doing everything right. Why don't you try disabling everything else and ONLY feeding the samples into your buffer?

I did this when I was writing a pitch detector and it worked fine. Once I had 30 seconds of samples, I just printed everything to the console, much as you are saving to file.

I really think there is much more benefit in coding a visualizer from scratch. aurioTouch is a mess, and it will take longer to figure out how it works than it would take to actually build one.
