I'm using the AVFoundation framework. In my sample buffer delegate I have the following code:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pb = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pb];
    self.imageView.image = [UIImage imageWithCIImage:ciImage];
}
I am able to use the CIImage to run the face detector, etc., but it never shows up in the UIImageView; the image view just stays white. Any ideas as to the problem? I am using the following to set up my session:
self.session = [[AVCaptureSession alloc] init];
self.session.sessionPreset = AVCaptureSessionPreset640x480;
self.videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:self.videoDevice error:nil];
self.frameOutput = [[AVCaptureVideoDataOutput alloc] init];
self.frameOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
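The rest of my setup is the usual wiring, roughly this (the exact queue name is incidental):

[self.session addInput:self.videoInput];
[self.session addOutput:self.frameOutput];
// Deliver sample buffers to this class on a serial background queue.
[self.frameOutput setSampleBufferDelegate:self queue:dispatch_queue_create("videoQueue", NULL)];
[self.session startRunning];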
This might help. I was having the same issue (image not being drawn to screen) with this code:
CIImage *image = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"mushroom.jpg"]];
theImageView.image = [UIImage imageWithCIImage:image];
However, after changing the code to the following, it works correctly. As far as I can tell, a UIImage created with imageWithCIImage: has no underlying CGImage backing it, so a UIImageView has nothing it can actually draw; rendering through a CIContext first produces a real CGImage:
CIImage *image = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"mushroom.jpg"]];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:image fromRect:image.extent];
theImageView.image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage); // createCGImage:fromRect: follows the Create rule, so release it
To read more about CIContext, take a look at the class reference: http://developer.apple.com/library/ios/#DOCUMENTATION/GraphicsImaging/Reference/QuartzCoreFramework/Classes/CIContext_Class/Reference/Reference.html#//apple_ref/occ/cl/CIContext
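Applying the same idea to the capture delegate in the question would look roughly like this (a sketch, assuming a ciContext property that is created once, e.g. in viewDidLoad, rather than per frame; the UIImageView update is dispatched to the main thread because the video data output calls its delegate on the capture queue):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pb = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pb];

    // self.ciContext is assumed to be created once elsewhere, e.g.:
    //     self.ciContext = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [self.ciContext createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *uiImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage); // Create rule: we own this CGImage, so release it

    // UIKit must only be touched on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = uiImage;
    });
}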