AudioRecord and AudioTrack latency

Jordi Puigdellívol · Mar 21, 2011 · Viewed 9.7k times

I'm trying to develop an application like iRig for Android, so the first step is to capture the mic input and play it back at the same time.

I have that working, but the problem is that I get enough latency to make it unusable, and I'm afraid that once I start processing the buffer it will become completely unusable.

I use AudioRecord and AudioTrack like this:

    new Thread(new Runnable() {
        public void run() {
            // Start capture and playback before entering the loop
            mRecorder.startRecording();
            mPlayer.play();
            while (mRunning) {
                // Read one buffer from the mic and write it straight back out
                mRecorder.read(mBuffer, 0, mBufferSize);
                // TODO: Apply filters to the buffer here and then play the modified data
                mPlayer.write(mBuffer, 0, mBufferSize);
            }
        }
    }).start();

And the initialization is done this way:

    // ==================== INITIALIZE ========================= //
    public void initialize(){

        // Minimum buffer size the recorder will accept at this sample rate
        mBufferSize = AudioRecord.getMinBufferSize(mHz,
                    AudioFormat.CHANNEL_CONFIGURATION_MONO,
                    AudioFormat.ENCODING_PCM_16BIT);

        // Minimum buffer size for playback (may differ from the record size)
        mBufferSize2 = AudioTrack.getMinBufferSize(mHz,
                    AudioFormat.CHANNEL_CONFIGURATION_MONO,
                    AudioFormat.ENCODING_PCM_16BIT);

        mBuffer = new byte[mBufferSize];

        Log.v("MY AMP","Buffer size:" + mBufferSize);

        mRecorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    mHz,
                    AudioFormat.CHANNEL_CONFIGURATION_MONO,
                    AudioFormat.ENCODING_PCM_16BIT,
                    mBufferSize);

        mPlayer = new AudioTrack(AudioManager.STREAM_MUSIC,
                    mHz,
                    AudioFormat.CHANNEL_CONFIGURATION_MONO,
                    AudioFormat.ENCODING_PCM_16BIT,
                    mBufferSize2,
                    AudioTrack.MODE_STREAM);

    }

Do you know how to get a faster response? Thanks!

Answer

SirKnigget · Aug 2, 2011

Android's AudioTrack/AudioRecord classes have high latency because of their minimum buffer sizes. According to Google, those buffer sizes are kept large to minimize dropouts when GCs occur (which is the wrong decision in my opinion; you can optimize your own memory management).
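
As a rough illustration of the scale involved (assuming 44100 Hz, mono, 16-bit PCM, which the question doesn't state), you can convert the playback minimum buffer size into milliseconds of buffered audio:

    // Rough sketch: how many milliseconds of audio the minimum buffer holds,
    // assuming 44100 Hz, mono, 16-bit PCM (2 bytes per frame).
    int sampleRate = 44100;
    int minBufferBytes = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT);
    int bytesPerFrame = 2; // mono * 16-bit
    double bufferedMs = 1000.0 * minBufferBytes / (sampleRate * bytesPerFrame);
    Log.v("MY AMP", "Min buffer holds ~" + bufferedMs + " ms of audio");

That buffering is on top of whatever the HAL and mixer add, which is why a plain AudioRecord-to-AudioTrack loop feels sluggish.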

What you want to do is use OpenSL ES, which is available from Android 2.3. It provides native APIs for streaming audio. Here are some docs: http://mobilepearls.com/labs/native-android-api/opensles/index.html
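
OpenSL ES is a C API, so the actual audio loop has to live in native code built with the NDK; on the Java side you only need a thin wrapper. A minimal sketch of that wrapper (the class, library, and method names here are placeholders, not a real library):

    // Sketch of the Java side of an OpenSL ES based echo path. The library and
    // method names below are placeholders; the real work happens in C code
    // built with the NDK against the OpenSL ES headers.
    public class NativeAudioEngine {
        static {
            System.loadLibrary("native_audio"); // hypothetical .so built with the NDK
        }

        // Hypothetical native entry points implemented against OpenSL ES
        public native boolean startEcho(int sampleRate, int framesPerBuffer);
        public native void stopEcho();
    }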