Focus (Autofocus) not working in camera (AVFoundation AVCaptureSession)

woojtekr · Mar 22, 2011

I am using the standard AVFoundation classes to capture video and show a preview (http://developer.apple.com/library/ios/#qa/qa1702/_index.html)

Here is my code:

- (void)setupCaptureSession {       
    NSError *error = nil;

    [self setCaptureSession: [[AVCaptureSession alloc] init]]; 

    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

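    // Grab the default video capture device and enable continuous autofocus (focus settings may only be changed while the device is locked for configuration)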
    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device lockForConfiguration:&error]) {
        [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [device unlockForConfiguration];
    }

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device 
                                                                        error:&error];
    if (!input) {
        // TODO: Handle the error when the device input cannot be created
    }
    [[self captureSession] addInput:input];

    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [[self captureSession] addOutput:output];

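    // Deliver sample buffers to this object on a dedicated serial dispatch queue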
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

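    // Ask for BGRA pixel buffers so imageFromSampleBuffer: can feed them straight to CGBitmapContextCreate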
    output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                       forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    output.minFrameDuration = CMTimeMake(1, 15);

    [[self captureSession] startRunning];

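    // Insert the camera preview underneath any existing sublayers of the previewLayer view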
    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    captureVideoPreviewLayer.frame = previewLayer.bounds;
    [previewLayer.layer insertSublayer:captureVideoPreviewLayer atIndex:0];
    [previewLayer setHidden:NO];

    mutex = YES;
}

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
       fromConnection:(AVCaptureConnection *)connection { 
    if (mutex && ![device isAdjustingFocus] && ![device isAdjustingExposure] && ![device isAdjustingWhiteBalance]) {
        // something
    }
}

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); 

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, 
                                                 bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Free up the context and color space
    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}

Everything works OK, but sometimes there are issues:

  • Camera focus is not working. It's random: sometimes it works, sometimes it doesn't. I tried different devices, both iPhone 4 and 3GS, and googling turned up nothing. People only mention broken hardware, but I checked on three iPhone 4s and one iPhone 3GS and the problem shows up on all of them.
  • The camera takes quite long to load. I am using the ScannerKit API, which also uses the camera for the same purpose, and it loads about twice as fast as my implementation.

Any ideas what the problem could be? The first issue is definitely the more important one.

Answer

Ben Affleck · Jan 20, 2014

Old question, but this may save somebody hours of frustration. It's important to set the point of interest before calling setFocusMode; otherwise the camera will keep focusing on the previous focus point. Think of setFocusMode as a COMMIT. The same applies to setExposureMode.

AVCam sample by Apple is totally wrong and broken.
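For reference, here is a minimal sketch of the ordering the answer describes. It assumes device is the same AVCaptureDevice used in the question's setup code, and it simply uses the centre of the frame as the point of interest; in a real app the point would usually come from a tap on the preview, converted into the normalized (0,0)-(1,1) capture-device coordinate space.

CGPoint interestPoint = CGPointMake(0.5f, 0.5f); // placeholder: centre of the frame in device coordinates

NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    // 1. Set the point of interest first...
    if ([device isFocusPointOfInterestSupported]) {
        device.focusPointOfInterest = interestPoint;
    }
    // 2. ...then set the mode; this is the "commit" that makes the new point take effect
    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
        device.focusMode = AVCaptureFocusModeContinuousAutoFocus;
    }

    // Same ordering for exposure
    if ([device isExposurePointOfInterestSupported]) {
        device.exposurePointOfInterest = interestPoint;
    }
    if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
        device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    }

    [device unlockForConfiguration];
} else {
    NSLog(@"lockForConfiguration failed: %@", error);
}

If the point comes from a tap on an AVCaptureVideoPreviewLayer, captureDevicePointOfInterestForPoint: (iOS 6 and later) does the view-to-device coordinate conversion.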