I am trying to use Core Image's face detection in iOS 5, but it never detects anything. I'm attempting to detect faces in an image that was just captured by the camera, using this code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    NSDictionary *detectorOptions = [[NSDictionary alloc] initWithObjectsAndKeys:CIDetectorAccuracyHigh, CIDetectorAccuracy, nil];
    CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];
    NSArray *features = [faceDetector featuresInImage:image.CIImage];
    NSLog(@"Features = %@", features);
    [self dismissModalViewControllerAnimated:YES];
}
This compiles and runs fine, but the features array is always empty regardless of what's in the image... Any ideas?
I can't reply to your @14:52 comment directly, Vic320, but I've been playing with the front camera for face detection, and I went round in circles for a while because I couldn't get it to pick up my face at all...
It turns out detection is very sensitive to rotation: when holding my iPad 2 in portrait (as you'd expect while using the front camera) I was getting less than 10% recognition accuracy. On a whim, I turned it sideways and got 100% recognition with the front camera.
A simple fix, if you're always using the front camera in portrait, is to add this little snippet:
NSDictionary* imageOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:6] forKey:CIDetectorImageOrientation];
NSArray* features = [detector featuresInImage:image options:imageOptions];
That 6 in there forces the detector to operate in portrait mode; it's a standard EXIF orientation value (6 corresponds to UIImageOrientationRight, the usual orientation for a portrait camera shot). Apple's SquareCam sample has a whole bunch of utility methods to figure out what orientation you're in if you need to determine it dynamically; a sketch along those lines is below.
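If you're starting from a UIImage, as in the question, you don't even need the SquareCam machinery: the image's imageOrientation property maps directly onto the EXIF values the detector expects. Here's a minimal sketch; the helper name is my own invention, but the UIImageOrientation-to-EXIF mapping itself is the standard one:

// Maps a UIImageOrientation onto the EXIF orientation value (1-8)
// that CIDetectorImageOrientation expects. Helper name is illustrative.
static int exifOrientationFromImageOrientation(UIImageOrientation orientation) {
    switch (orientation) {
        case UIImageOrientationUp:            return 1;
        case UIImageOrientationDown:          return 3;
        case UIImageOrientationLeft:          return 8;
        case UIImageOrientationRight:         return 6; // a portrait camera shot
        case UIImageOrientationUpMirrored:    return 2;
        case UIImageOrientationDownMirrored:  return 4;
        case UIImageOrientationLeftMirrored:  return 5;
        case UIImageOrientationRightMirrored: return 7;
    }
    return 1; // fall back to "upright"
}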
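And for completeness, here's roughly how I'd drop all of this into the picker delegate from the question (assuming CoreImage is linked and <CoreImage/CoreImage.h> is imported). One gotcha worth flagging: image.CIImage is nil for a photo straight off the camera, because the picker returns a CGImage-backed UIImage, so you have to build the CIImage yourself. This is a sketch that mirrors the question's detector setup, not a tested drop-in:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

    // image.CIImage is nil here (camera images are CGImage-backed),
    // so create the CIImage explicitly from the CGImage.
    CIImage *ciImage = [CIImage imageWithCGImage:image.CGImage];

    NSDictionary *detectorOptions = [NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh forKey:CIDetectorAccuracy];
    CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];

    // Tell the detector which way is up, using the helper above.
    int exifOrientation = exifOrientationFromImageOrientation(image.imageOrientation);
    NSDictionary *imageOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:exifOrientation] forKey:CIDetectorImageOrientation];

    NSArray *features = [faceDetector featuresInImage:ciImage options:imageOptions];
    NSLog(@"Features = %@", features);

    [self dismissModalViewControllerAnimated:YES];
}

For a portrait shot from the camera (UIImageOrientationRight), the helper hands back exactly the hard-coded 6 from the snippet above, so the two approaches agree.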