I'm writing a custom movie recording app and have implemented AVAssetWriter and AVAssetWriterInputPixelBufferAdaptor for writing frames to a file. In the AVCaptureVideoDataOutput sample buffer delegate callback I'm trying to apply a CIFilter to the sampleBuffer: I first get a CVPixelBufferRef from the sample buffer, create a CIImage from it, apply the CIFilter, and grab the resulting CIImage:
CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIFilter *hueAdjust = [CIFilter filterWithName:@"CIHueAdjust"];
[hueAdjust setDefaults];
[hueAdjust setValue: image forKey: @"inputImage"];
[hueAdjust setValue: [NSNumber numberWithFloat: 2.094] forKey: @"inputAngle"]; // ~120 degrees in radians
CIImage *result = [hueAdjust valueForKey: @"outputImage"];
CVPixelBufferRef newBuffer = //Convert CIImage...
How would I go about converting that CIImage so I can:
[self.filteredImageWriter appendPixelBuffer:newBuffer withPresentationTime:lastSampleTime];
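For reference, the writer/adaptor pair is set up roughly like this; the file type, codec, dimensions and outputURL below are placeholders, not necessarily what I'm using, and only filteredImageWriter (the AVAssetWriterInputPixelBufferAdaptor) is named in the code above:
// Rough sketch of the writer / adaptor setup — dimensions, codec, file type and outputURL are placeholders.
NSError *error = nil;
AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                                       fileType:AVFileTypeQuickTimeMovie
                                                          error:&error];

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:1280], AVVideoWidthKey,
                               [NSNumber numberWithInt:720], AVVideoHeightKey,
                               nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                     outputSettings:videoSettings];
writerInput.expectsMediaDataInRealTime = YES;

// The adaptor's source pixel format should match the buffers you append (32BGRA here).
NSDictionary *sourceAttributes = [NSDictionary dictionaryWithObject:
                                  [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                                             forKey:(__bridge id)kCVPixelBufferPixelFormatTypeKey];
self.filteredImageWriter = [AVAssetWriterInputPixelBufferAdaptor
                            assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                       sourcePixelBufferAttributes:sourceAttributes];
[assetWriter addInput:writerInput];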
UPDATED ANSWER
While my previous method works, I was able to tweak it to simplify the code. It also seems to run slightly faster.
CVPixelBufferRef samplePixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(samplePixelBuffer, 0); // NOT SURE IF NEEDED // NO PERFORMANCE IMPROVEMENTS IF REMOVED
NSDictionary *options = [NSDictionary dictionaryWithObject:(__bridge id)rgbColorSpace forKey:kCIImageColorSpace];
CIImage *outputImage = [CIImage imageWithCVPixelBuffer:samplePixelBuffer options:options];
//-----------------
// FILTER OUTPUT IMAGE
@autoreleasepool {
    outputImage = [self applyEffectToCIImage:outputImage dict:dict];
}
CVPixelBufferUnlockBaseAddress(samplePixelBuffer, 0); // NOT SURE IF NEEDED // NO PERFORMANCE IMPROVEMENTS IF REMOVED
//-----------------
// RENDER OUTPUT IMAGE BACK TO PIXEL BUFFER
[self.filterContext render:outputImage toCVPixelBuffer:samplePixelBuffer bounds:[outputImage extent] colorSpace:CGColorSpaceCreateDeviceRGB()]; // DOES NOT SEEM TO WORK USING rgbColorSpace // NOTE: this creates a new color space every frame; caching one would avoid the per-frame allocation
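rgbColorSpace, self.filterContext and applyEffectToCIImage:dict: are created elsewhere. A minimal sketch of what they might look like, assuming a GPU-backed CIContext and a simple hue filter (the names and dict keys below are assumptions, not the exact code):
// Sketch only — the actual context setup and filter helper are not shown above.
- (void)setUpFilterContext {
    rgbColorSpace = CGColorSpaceCreateDeviceRGB(); // ivar, created once, not per frame
    self.filterContext = [CIContext contextWithOptions:
                          [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO]
                                                      forKey:kCIContextUseSoftwareRenderer]];
}

// Hypothetical applyEffectToCIImage:dict: — applies a CIHueAdjust with an angle taken from dict.
- (CIImage *)applyEffectToCIImage:(CIImage *)image dict:(NSDictionary *)dict {
    CIFilter *hueAdjust = [CIFilter filterWithName:@"CIHueAdjust"];
    [hueAdjust setDefaults];
    [hueAdjust setValue:image forKey:kCIInputImageKey];
    [hueAdjust setValue:[dict objectForKey:@"inputAngle"] forKey:kCIInputAngleKey];
    return [hueAdjust valueForKey:kCIOutputImageKey];
}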
PREVIOUS ANSWER
I just implemented the following to get a pixel buffer from a CIImage. Make sure your pixel formats are consistent, otherwise you will have color issues. Also, the CFDictionaries are very important. http://allmybrain.com/2011/12/08/rendering-to-a-texture-with-ios-5-texture-cache-api/
// EMPTY IOSURFACE DICT — required so the new pixel buffer is IOSurface-backed
CFDictionaryRef empty = CFDictionaryCreate(kCFAllocatorDefault,
                                           NULL,
                                           NULL,
                                           0,
                                           &kCFTypeDictionaryKeyCallBacks,
                                           &kCFTypeDictionaryValueCallBacks);
CFMutableDictionaryRef attributes = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                                              1,
                                                              &kCFTypeDictionaryKeyCallBacks,
                                                              &kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(attributes, kCVPixelBufferIOSurfacePropertiesKey, empty);

// Create a pixel buffer matching the filtered image's size; 32BGRA keeps the format consistent
CVPixelBufferRef pixelBuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                      outputImage.extent.size.width,
                                      outputImage.extent.size.height,
                                      kCVPixelFormatType_32BGRA,
                                      attributes,
                                      &pixelBuffer);
CFRelease(attributes);
CFRelease(empty);

if (status == kCVReturnSuccess && pixelBuffer != NULL) {
    CVPixelBufferLockBaseAddress(pixelBuffer, 0); // NOT SURE IF NEEDED // KEPT JUST IN CASE
    [self.filterContext render:outputImage toCVPixelBuffer:pixelBuffer bounds:[outputImage extent] colorSpace:CGColorSpaceCreateDeviceRGB()];
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0); // NOT SURE IF NEEDED // KEPT JUST IN CASE
}
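After the render, the new buffer can be appended at the original sample's timestamp and then released (CVPixelBufferCreate returns a +1 reference that we own). Roughly, assuming filteredImageWriter is the adaptor from the question:
// Append the filtered buffer, then release it.
CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
if (self.filteredImageWriter.assetWriterInput.readyForMoreMediaData) {
    [self.filteredImageWriter appendPixelBuffer:pixelBuffer withPresentationTime:lastSampleTime];
}
CVPixelBufferRelease(pixelBuffer);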