I am trying to apply filters to a video composition created with AVFoundation on iOS (filters could be, e.g., blur, pixelate, sepia, etc.). I need to both apply the effects in real time and be able to render the composited video out to disk, but I'm happy to start with just one or the other.
Unfortunately, I can't seem to figure this one out. Here's what I can do:
Other apps do this (I think), so I assume I'm missing something obvious.
Note: I've looked into GPUImage and I'd love to use it, but it just doesn't work well with movies, especially movies with audio. See, for example:
You could use the AVVideoCompositing protocol, together with the AVAsynchronousVideoCompositionRequest class, to implement a custom compositor.
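A minimal sketch of what conforming to AVVideoCompositing might look like (the class name FilterCompositor is just a placeholder, and the 32BGRA pixel format is an assumption):

@import AVFoundation;
@import CoreImage;

@interface FilterCompositor : NSObject <AVVideoCompositing>
@end

@implementation FilterCompositor

// Pixel format we want source frames delivered in.
- (NSDictionary *)sourcePixelBufferAttributes {
    return @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}

// Pixel format of the output buffers the render context vends.
- (NSDictionary *)requiredPixelBufferAttributesForRenderContext {
    return @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}

- (void)renderContextChanged:(AVVideoCompositionRenderContext *)newRenderContext {
    // Hold onto the context here if you need its render size or newPixelBuffer later.
}

- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request {
    // Per-frame filtering goes here; see below.
}

@end

Inside startVideoCompositionRequest:, you grab a source frame, filter it with Core Image, and render the result into an output buffer: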
// sourceFrameByTrackID: is an instance method on the request; trackID might come from request.sourceTrackIDs.
CVPixelBufferRef pixelBuffer = [request sourceFrameByTrackID:trackID];
CIImage *theImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
// +filterWithName:keysAndValues: takes key/value pairs; kCIInputImageKey is the standard input-image key.
CIImage *motionBlurredImage = [[CIFilter filterWithName:@"CIMotionBlur"
                                          keysAndValues:kCIInputImageKey, theImage, nil]
                                            valueForKey:kCIOutputImageKey];
// Render the filtered image into the destination pixel buffer.
CIContext *someCIContext = [CIContext contextWithEAGLContext:eaglContext];
[someCIContext render:motionBlurredImage toCVPixelBuffer:outputBuffer];
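The snippet leaves a couple of things implicit: outputBuffer would typically be vended by the request's render context, and the request has to be told when the frame is done. Roughly, as a sketch under the same assumptions:

CVPixelBufferRef outputBuffer = [request.renderContext newPixelBuffer];
// ... render into outputBuffer as above ...
[request finishWithComposedVideoFrame:outputBuffer];
CVPixelBufferRelease(outputBuffer); // newPixelBuffer follows the create rule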
Then render the pixel buffer using OpenGL ES as described in Apple's documentation. This would allow you to implement any number of transitions or filters. You can then set AVAssetExportSession.videoComposition and export the composited video to disk.
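Wiring it up for export might look like this (a sketch: asset and outputURL are assumed to exist, and FilterCompositor is the hypothetical class from above):

// Build a video composition that uses the custom compositor.
AVMutableVideoComposition *videoComposition =
    [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
videoComposition.customVideoCompositorClass = [FilterCompositor class];

// Export the filtered composition to disk.
AVAssetExportSession *exportSession =
    [AVAssetExportSession exportSessionWithAsset:asset
                                      presetName:AVAssetExportPresetHighestQuality];
exportSession.videoComposition = videoComposition;
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    // Inspect exportSession.status and exportSession.error here.
}];

For the real-time half of your question, the same video composition can be assigned to an AVPlayerItem's videoComposition property to get filtered playback.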