Realtime Audio with AVAudioEngine

Michael Dorner · Jun 24, 2014

Hi. I want to implement a realtime audio application with the new AVAudioEngine in Swift. Does anyone have experience with the new framework? How do real-time applications work with it?

My first idea was to store the (processed) input data in an AVAudioPCMBuffer object and then play it back with an AVAudioPlayerNode, as you can see in my demo class:

import AVFoundation

class AudioIO {
    var audioEngine: AVAudioEngine
    var audioInputNode: AVAudioInputNode
    var audioPlayerNode: AVAudioPlayerNode
    var audioMixerNode: AVAudioMixerNode
    var audioBuffer: AVAudioPCMBuffer

    init() {
        audioEngine = AVAudioEngine()
        audioPlayerNode = AVAudioPlayerNode()
        audioMixerNode = audioEngine.mainMixerNode
        audioInputNode = audioEngine.inputNode

        let frameLength = AVAudioFrameCount(256)
        let format = audioInputNode.outputFormat(forBus: 0)
        audioBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameLength)!
        audioBuffer.frameLength = frameLength

        // tap the input node and copy the incoming samples into my playback buffer
        audioInputNode.installTap(onBus: 0, bufferSize: frameLength, format: format) { buffer, _ in
            guard let input = buffer.floatChannelData?[0],
                  let output = self.audioBuffer.floatChannelData?[0] else { return }
            let frameCount = min(Int(buffer.frameLength), Int(self.audioBuffer.frameLength))
            for i in 0..<frameCount {
                // doing my real time stuff
                output[i] = input[i]
            }
        }

        // setup audio engine
        audioEngine.attach(audioPlayerNode)
        audioEngine.connect(audioPlayerNode, to: audioMixerNode, format: format)
        try? audioEngine.start()

        // play player and loop the buffer
        audioPlayerNode.play()
        audioPlayerNode.scheduleBuffer(audioBuffer, at: nil, options: .loops, completionHandler: nil)
    }
}

But this is far from real time and not very efficient. Any ideas or experiences? It doesn't matter whether you prefer Objective-C or Swift; I am grateful for any notes, remarks, comments, solutions, etc.

Answer

Jason McClinsey · Oct 28, 2014

I've been experimenting with AVAudioEngine in both Objective-C and Swift. In the Objective-C version of my engine, all audio processing is done purely in C (by caching the raw C sample pointers available through AVAudioPCMBuffer, and operating on the data with only C code). The performance is impressive.

Out of curiosity, I ported this engine to Swift. With tasks like playing an audio file linearly, or generating tones via FM synthesis, the performance is quite good, but as soon as arrays are involved (e.g. with granular synthesis, where sections of audio are played back and manipulated in a non-linear fashion), there is a significant performance hit. Even with the best optimization, CPU usage is 30-40% greater than with the Objective-C/C version. I'm new to Swift, so perhaps there are other optimizations of which I am ignorant, but as far as I can tell, C/C++ are still the best choice for realtime audio.

Also look at The Amazing Audio Engine. I'm considering this, as well as direct use of the older C API.
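To give a flavor of the pointer-based approach, here is a minimal Swift sketch (an illustrative example, not the actual C code from my engine) that operates directly on a buffer's floatChannelData pointers instead of going through Swift arrays:

import AVFoundation

// Illustrative only: apply a gain to every sample of a PCM buffer in place,
// touching nothing but the raw channel pointers (no Swift arrays involved).
func applyGain(_ gain: Float, to buffer: AVAudioPCMBuffer) {
    guard let channels = buffer.floatChannelData else { return }
    let channelCount = Int(buffer.format.channelCount)
    let frameCount = Int(buffer.frameLength)
    for channel in 0..<channelCount {
        let samples = channels[channel]   // UnsafeMutablePointer<Float>
        for frame in 0..<frameCount {
            samples[frame] *= gain
        }
    }
}

For non-interleaved float buffers, floatChannelData gives you one pointer per channel, which is why the outer loop runs over channels.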

If you need to process live audio, then AVAudioEngine may not be for you. See my answer to this question: "I want to call 20 times per second the installTapOnBus:bufferSize:format:block:".
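For context: the bufferSize you pass to the tap is only a request, and the engine may deliver buffers of a different (often much larger) size at its own pace. A quick, hypothetical snippet to log what you actually get:

import AVFoundation

// Assumes the audio session / microphone permission is already set up.
let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

// Ask for 256-frame buffers; the delivered frameLength shows what the
// engine actually hands the block, which may be considerably larger.
input.installTap(onBus: 0, bufferSize: 256, format: format) { buffer, _ in
    print("tap delivered \(buffer.frameLength) frames")
}

do {
    try engine.start()
} catch {
    print("could not start engine: \(error)")
}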