I just watched the WWDC video (Session 502: AVAudioEngine in Practice) on AVAudioEngine and am very excited to build an app on this tech. I haven't been able to figure out how I might do level monitoring of the microphone input, or of a mixer's output.

Can anyone help? To be clear, I'm talking about monitoring the current input signal (and displaying this in the UI), not the input/output volume setting of a channel/track.

I know you can do this with AVAudioRecorder, but this is not an AVAudioNode, which AVAudioEngine requires.
Try installing a tap on the main mixer node, then speed up the updates by setting the buffer's frame length, then read the samples and compute an average, something like this:
Import the framework at the top of the file:

#import <Accelerate/Accelerate.h>

Define the filter constant (explained at the end of this answer) and add two properties:

#define LEVEL_LOWPASS_TRIG 0.30

@property float averagePowerForChannel0;
@property float averagePowerForChannel1;

Then install the tap on the main mixer:
self.mainMixer = [self.engine mainMixerNode];
[self.mainMixer installTapOnBus:0 bufferSize:1024 format:[self.mainMixer outputFormatForBus:0] block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
    // Truncate the buffer so each callback processes a consistent, small chunk.
    [buffer setFrameLength:1024];
    UInt32 inNumberFrames = buffer.frameLength;

    if (buffer.format.channelCount > 0)
    {
        Float32 *samples = (Float32 *)buffer.floatChannelData[0];
        Float32 avgValue = 0;

        // Mean of magnitudes for channel 0.
        vDSP_meamgv(samples, 1, &avgValue, inNumberFrames);

        // Convert to dB (clamping silence to -100) and low-pass filter the result.
        self.averagePowerForChannel0 = (LEVEL_LOWPASS_TRIG * ((avgValue == 0) ? -100 : 20.0 * log10f(avgValue)))
                                     + ((1 - LEVEL_LOWPASS_TRIG) * self.averagePowerForChannel0);
        // Mono fallback: mirror channel 0 into channel 1.
        self.averagePowerForChannel1 = self.averagePowerForChannel0;
    }
    if (buffer.format.channelCount > 1)
    {
        Float32 *samples = (Float32 *)buffer.floatChannelData[1];
        Float32 avgValue = 0;

        // Mean of magnitudes for channel 1.
        vDSP_meamgv(samples, 1, &avgValue, inNumberFrames);

        self.averagePowerForChannel1 = (LEVEL_LOWPASS_TRIG * ((avgValue == 0) ? -100 : 20.0 * log10f(avgValue)))
                                     + ((1 - LEVEL_LOWPASS_TRIG) * self.averagePowerForChannel1);
    }
}];
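If you want the microphone level specifically rather than the mixer output, the same tap can go on the engine's input node instead; AVAudioInputNode is an AVAudioNode, so installTapOnBus: works the same way. A minimal sketch, reusing the per-channel averaging code above:

AVAudioInputNode *inputNode = [self.engine inputNode];
[inputNode installTapOnBus:0 bufferSize:1024 format:[inputNode outputFormatForBus:0] block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
    // ...same per-channel averaging as in the mixer tap above...
}];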
Then read whichever value you want, for example:

NSLog(@"===test===%.2f", self.averagePowerForChannel1);
To get peak values, use vDSP_maxmgv instead of vDSP_meamgv.
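For example, swapping the peak function into the channel-0 branch would look like this (a sketch only; peakValue is a hypothetical local):

Float32 *samples = (Float32 *)buffer.floatChannelData[0];
Float32 peakValue = 0;
// Maximum magnitude over the frame instead of the mean.
vDSP_maxmgv(samples, 1, &peakValue, inNumberFrames);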
LEVEL_LOWPASS_TRIG is a simple filter coefficient between 0.0 and 1.0. At 0.0 you filter out all values and get no data; at 1.0 you let through too much noise. In short, the higher the value, the more variation you get in the data; a value between 0.10 and 0.30 works well for most applications.
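For reference, the smoothing applied above is a one-pole low-pass filter (an exponential moving average). A hypothetical helper isolating the formula:

static inline float lowPassLevel(float previousDB, float currentDB, float alpha) {
    // alpha = LEVEL_LOWPASS_TRIG: higher alpha tracks the input faster (noisier),
    // lower alpha smooths more (slower to respond).
    return alpha * currentDB + (1.0f - alpha) * previousDB;
}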