I want to detect not the pitch, but the pitch class of a sung note.
So, whether it is C4 or C5 is not important: they must both be detected as C.
Imagine the 12 semitones arranged on a clock face, with the needle pointing to the pitch class. That's what I'm after! Ideally, I would also like to be able to tell whether the sung note is spot-on or slightly off.
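To make the "clock face" idea concrete, here is a rough sketch of the mapping I have in mind (the function name and the fixed A4 = 440 Hz reference are just placeholders for illustration, not tied to any particular detection library):

#include <math.h>

// Sketch: fold a detected frequency onto the 12-point "clock face".
// Assumes A4 = 440 Hz; pitch class 0 = C, 1 = C#, ..., 11 = B.
static int pitchClassForFrequency(double frequency, double *centsOff) {
    double semitonesFromA4 = 12.0 * log2(frequency / 440.0);
    long nearest = lround(semitonesFromA4);
    if (centsOff) {
        *centsOff = 100.0 * (semitonesFromA4 - nearest); // negative = flat, positive = sharp
    }
    return (int)(((nearest + 9) % 12 + 12) % 12); // shift so that A maps to 9
}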
This is not a duplicate of previously asked questions, as it introduces these constraints:
- the sound source is a single human voice, hopefully with negligible background interference (although I may need to deal with this)
- the octave is not important, only the pitch class
EDIT -- Links:
- Real time pitch detection
- Using the Apple FFT and Accelerate Framework
See my answer here for getting smooth frequency detection: https://stackoverflow.com/a/11042551/1457445
As far as snapping this frequency to the nearest note goes, here is a method I created for my tuner app:
- (int) snapFreqToMIDI: (float) frequency {
    // referenceA is the tuner's reference frequency for A (an ivar elsewhere in the class).
    // Adding 0.5 before the implicit float-to-int conversion rounds to the nearest note.
    int midiNote = (12 * (log10(frequency / referenceA) / log10(2)) + 57) + 0.5;
    return midiNote;
}
This will return the nearest MIDI note number (see http://www.phys.unsw.edu.au/jw/notes.html).
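If you also want to know how far off the sung note is (the "spot-on or slightly off" part of the question), you can compare the unrounded value against the snapped one. A sketch along the same lines as the method above -- the method name is mine, and it assumes the same referenceA and +57 offset:

- (float) centsOffForFreq: (float) frequency {
    // Deviation in cents from the nearest note; negative = flat, positive = sharp.
    float exactNote = 12.0f * log2f(frequency / referenceA) + 57.0f;
    float nearestNote = roundf(exactNote);
    return 100.0f * (exactNote - nearestNote);
}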
To get a pitch-class string from this MIDI note value:
- (NSString*) midiToString: (int) midiNote {
    // midiNote % 12 discards the octave, leaving only the pitch class.
    NSArray *noteStrings = [[NSArray alloc] initWithObjects:@"C", @"C#", @"D", @"D#", @"E", @"F", @"F#", @"G", @"G#", @"A", @"A#", @"B", nil];
    return [noteStrings objectAtIndex:midiNote % 12];
}
For an example implementation of pitch detection with output smoothing, have a look at musicianskit.com/developer.php