Drawing a rectangle on top of an AVCaptureVideoPreviewLayer, possible?

Slade Villena · Dec 14, 2010 · Viewed 9.7k times

I've been banging my head on this for a few days now.

I want to draw a rectangle on top of a CALayer (an AVCaptureVideoPreviewLayer), which happens to be the live video feed from the camera on an iPhone 4.

Here's part of my setup:

    //(in function for initialization)

    -(void)initDevices {
        AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];

        AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
        captureOutput.alwaysDiscardsLateVideoFrames = YES;
        captureOutput.minFrameDuration = CMTimeMake(1, 30);
        dispatch_queue_t queue;
        queue = dispatch_queue_create("cameraQueue", NULL);
        [captureOutput setSampleBufferDelegate:self queue:queue];
        dispatch_release(queue);

        NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
        NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
        NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
        [captureOutput setVideoSettings:videoSettings];

        self.captureSession = [[AVCaptureSession alloc] init];
        [self.captureSession addInput:captureInput];
        [self.captureSession addOutput:captureOutput];
        [self.captureSession setSessionPreset:AVCaptureSessionPresetHigh];

        //preview layer showing the camera feed, with self as its drawing delegate
        self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        self.prevLayer.frame = CGRectMake(0, 0, 400, 400);
        self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        self.prevLayer.delegate = self;
        [self.view.layer addSublayer:self.prevLayer];
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
      didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
             fromConnection:(AVCaptureConnection *)connection {

        [self performSelectorOnMainThread:@selector(drawme) withObject:nil waitUntilDone:YES];
    }

    - (void)drawme {
        [self.prevLayer setNeedsDisplay];
    }

    //delegate function that draws to a CALayer
    - (void)drawLayer:(CALayer*)layer inContext:(CGContextRef)ctx {
        NSLog(@"hello layer!");
        CGContextSetRGBFillColor(ctx, 1, 0, 0, 1);
        CGContextFillRect(ctx, CGRectMake(0, 0, 200, 100));
    }

Is this even possible? With my current code, "hello layer!" does print, but no filled rectangle appears over the camera feed.

Any help would be awesome. :)

Answer

cookwhy · Aug 20, 2015

I think you should add a separate layer on top of the AVCaptureVideoPreviewLayer and draw into that, rather than drawing into the preview layer itself. I've modified your example code accordingly; give it a try.

    //(in function for initialization)

    -(void)initDevices {
        AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];

        AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
        captureOutput.alwaysDiscardsLateVideoFrames = YES;
        captureOutput.minFrameDuration = CMTimeMake(1, 30);
        dispatch_queue_t queue;
        queue = dispatch_queue_create("cameraQueue", NULL);
        [captureOutput setSampleBufferDelegate:self queue:queue];
        dispatch_release(queue);

        NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
        NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
        NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
        [captureOutput setVideoSettings:videoSettings];

        self.captureSession = [[AVCaptureSession alloc] init];
        [self.captureSession addInput:captureInput];
        [self.captureSession addOutput:captureOutput];
        [self.captureSession setSessionPreset:AVCaptureSessionPresetHigh];

        self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        self.prevLayer.frame = CGRectMake(0, 0, 400, 400);
        self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [self.view.layer addSublayer:self.prevLayer];

        //overlay layer stacked on top of the preview layer; all custom drawing goes here
        self.drawLayer = [CAShapeLayer layer];
        [self.drawLayer setFrame:[self.prevLayer bounds]];   //cover the preview layer
        [self.drawLayer setDelegate:self];
        [self.drawLayer setNeedsDisplay];
        [self.prevLayer addSublayer:self.drawLayer];
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
      didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
             fromConnection:(AVCaptureConnection *)connection {

        [self performSelectorOnMainThread:@selector(drawme) withObject:nil waitUntilDone:YES];
    }

    - (void)drawme {
        [self.drawLayer setNeedsDisplay];
    }

    //delegate function that draws to a CALayer
    - (void)drawLayer:(CALayer*)layer inContext:(CGContextRef)ctx {
        NSLog(@"hello layer!");
        CGContextSetRGBFillColor(ctx, 1, 0, 0, 1);
        CGContextFillRect(ctx, CGRectMake(0, 0, 200, 100));
    }