I'm trying to display a UIImage in real time coming from the camera, and it seems that my UIImageView is not displaying the image properly. This is the method an AVCaptureVideoDataOutputSampleBufferDelegate has to implement:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *theImage = [self imageFromSampleBuffer:sampleBuffer];
    // NSLog(@"Got an image! %f %f", theImage.size.width, theImage.size.height);
    // NSLog(@"The image view is %@", imageView);
    // UIImage *theImage = [[UIImage alloc] initWithData:[NSData
    //     dataWithContentsOfURL:[NSURL
    //     URLWithString:@"http://farm4.static.flickr.com/3092/2915896504_a88b69c9de.jpg"]]];
    [self.session stopRunning];
    [imageView setImage:theImage];
}
To get the easy problems out of the way:

- If I load an arbitrary image from the web instead (the commented-out code above) and send setImage:theImage to the imageView, the image is loaded and displayed correctly (and the second call to NSLog reports a non-nil object).
- imageFromSampleBuffer: is at least basically fine, since NSLog reports the size to be 360x480, which is the size I expected.

The code I'm using is the recently-posted AVFoundation snippet from Apple, available here.
In particular, that is the code I use to set up the AVCaptureSession object and friends (of which I understand very little) and to create the UIImage object from the Core Video buffers (that's the imageFromSampleBuffer method).
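For reference, here is roughly how my capture pipeline is configured, condensed from Apple's snippet. This is a minimal sketch, not the exact code: the method name setupCaptureSession and the queue label are mine, and the key assumption is that the data output's videoSettings request kCVPixelFormatType_32BGRA, which is the pixel layout imageFromSampleBuffer: expects.

#import <AVFoundation/AVFoundation.h>

- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session with a modest preset for real-time processing.
    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPresetMedium;

    // Wire the default camera into the session.
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (input) [self.session addInput:input];

    // The data output delivers CMSampleBufferRefs to the delegate method above.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.alwaysDiscardsLateVideoFrames = YES;

    // Ask for BGRA frames; the CGBitmapContextCreate flags used when
    // converting the buffer to an image assume this layout.
    output.videoSettings = @{ (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey :
                              @(kCVPixelFormatType_32BGRA) };

    // Deliver frames on a serial background queue.
    dispatch_queue_t queue = dispatch_queue_create("camera_frames", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    [self.session addOutput:output];

    [self.session startRunning];
}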
Finally, I can get the application to crash if I try to send drawInRect: to a plain UIView subclass with the UIImage returned by imageFromSampleBuffer:, while it doesn't crash if I use a UIImage from a URL as above. Here is the stack trace from the debugger inside the crash (I get an EXC_BAD_ACCESS signal):
#0 0x34a977ee in decode_swap ()
#1 0x34a8f80e in decode_data ()
#2 0x34a8f674 in img_decode_read ()
#3 0x34a8a76e in img_interpolate_read ()
#4 0x34a63b46 in img_data_lock ()
#5 0x34a62302 in CGSImageDataLock ()
#6 0x351ab812 in ripc_AcquireImage ()
#7 0x351a8f28 in ripc_DrawImage ()
#8 0x34a620f6 in CGContextDelegateDrawImage ()
#9 0x34a61fb4 in CGContextDrawImage ()
#10 0x321fd0d0 in -[UIImage drawInRect:blendMode:alpha:] ()
#11 0x321fcc38 in -[UIImage drawInRect:] ()
EDIT: Here's some more information about the UIImage being returned by that bit of code.

Using the method described here, I can get to the pixels and print them, and they look OK at first glance (every value in the alpha channel is 255, for example). However, something is slightly off with the buffer sizes. The image I get from Flickr from that URL is 375x500, and its [pixelData length] gives me 750000 = 375*500*4, which is the expected value. However, the pixel data of the image returned from imageFromSampleBuffer: has size 691208 = 360*480*4 + 8, so there are 8 extra bytes in the pixel data. CVPixelBufferGetDataSize itself returns this off-by-8 value. I thought for a moment that it could be due to allocating buffers at aligned positions in memory, but 691200 is already a multiple of 256, so that doesn't explain it either. This size discrepancy is the only difference I can tell between the two UIImages, and it could be causing the trouble. Still, there's no reason allocating extra memory for the buffer should cause an EXC_BAD_ACCESS violation.
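For anyone probing the same discrepancy, the buffer geometry can be checked directly against the expectation of a tightly packed width*height*4 layout. Here is a minimal diagnostic sketch (the method name logBufferGeometry: is mine): both row padding (bytesPerRow greater than width*4) and trailing allocation slack show up in this output, and either one breaks the tightly-packed assumption.

- (void)logBufferGeometry:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    size_t width       = CVPixelBufferGetWidth(imageBuffer);
    size_t height      = CVPixelBufferGetHeight(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t dataSize    = CVPixelBufferGetDataSize(imageBuffer);

    // If bytesPerRow > width * 4, rows are padded for alignment and the
    // pixel data cannot be treated as a packed width*height*4 array.
    NSLog(@"%zux%zu, bytesPerRow=%zu (tight would be %zu), dataSize=%zu",
          width, height, bytesPerRow, width * 4, dataSize);

    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
}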
Thanks a lot for any help, and let me know if you need more information.
I had the same problem ... but I found this old post, and its method of creating the CGImageRef works!
http://forum.unity3d.com/viewtopic.php?p=300819
Here's a working sample:
The app has a member UIImageView *theImage; (note it must be an image view, not a UIImage, since the delegate below assigns to theImage.image).
// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // ... just an example of how to get an image out of this ...
    CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];

    // The delegate runs on the capture queue; UIKit must only be touched
    // on the main thread, so hop over before updating the image view.
    dispatch_async(dispatch_get_main_queue(), ^{
        theImage.image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage); // balance the create in imageFromSampleBuffer:
    });
}
// Create a CGImageRef from sample buffer data
- (CGImageRef)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get the pixel buffer backing the sample buffer and lock it for access.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get information about the image
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the pixel data in a bitmap context; the byte-order/alpha flags
    // assume the capture output delivers kCVPixelFormatType_32BGRA frames.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace,
        kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create an image that the caller owns and must release.
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    /* CVBufferRelease(imageBuffer); */ // do not call this! the sample buffer owns it

    return newImage;
}
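A couple of notes on why this version behaves better (my reading, not something the original post states): CGBitmapContextCreateImage produces an image whose bitmap data is copied out of the pixel buffer rather than wrapping the buffer's memory directly, so the image stays valid for drawing after the capture pipeline reuses the buffer, and ownership is explicit, since the caller must balance the returned CGImageRef with CGImageRelease once the UIImage has been made. It still assumes BGRA frames: the kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst flags only match the data if the video output's videoSettings request kCVPixelFormatType_32BGRA, as in the setup sketch earlier.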