How to apply "filters" to AVCaptureVideoPreviewLayer

eschurter · Mar 1, 2011 · Viewed 27.4k times

My app is currently using AVFoundation to take the raw camera data from the rear camera of an iPhone and display it on an AVCaptureVideoPreviewLayer in real time.

My goal is to conditionally apply simple image filters to the preview layer. The images aren't saved, so I do not need to capture the output. For example, I would like to toggle a setting that converts the video coming in on the preview layer to black & white.

I found a question here that seems to accomplish something similar by capturing the individual video frames in a buffer, applying the desired transformations, then displaying each frame as a UIImage. For several reasons, this seems like overkill for my project and I'd like to avoid any performance issues it may cause.
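For context, the approach in that linked question looks roughly like the sketch below: an `AVCaptureVideoDataOutput` delegate callback converts each sample buffer's raw BGRA bytes into a `UIImage` for filtering and display. The filtering hook (`applyFilterAndDisplay:`) is a hypothetical placeholder, and the code assumes a capture session already configured with a BGRA-format video data output whose delegate is this object.

```objectivec
// Sketch of the frame-by-frame approach the linked question describes.
// Assumes an AVCaptureSession with an AVCaptureVideoDataOutput configured
// for kCVPixelFormatType_32BGRA, with this object as its delegate.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Wrap the raw pixel bytes in a bitmap context to produce a CGImage.
    void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace,
        kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    // [self applyFilterAndDisplay:frame]; // hypothetical CPU filtering hook

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}
```

Every frame makes a round trip through CPU memory and Core Graphics here, which is the performance cost the question is trying to avoid.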

Is this the only way to accomplish my goal?

As I mentioned, I am not looking to capture any of the AVCaptureSession's video, merely preview it.

Answer

Brad Larson · Mar 1, 2011

Probably the most performant way of handling this would be to use OpenGL ES for filtering and display of these video frames. You won't be able to do much with an AVCaptureVideoPreviewLayer directly, aside from adjusting its opacity when overlaid with another view or layer.

I have a sample application here where I grab frames from the camera and apply OpenGL ES 2.0 shaders to process the video in real time for display. In this application (explained in detail here), I was using color-based filtering to track objects in the camera view, but others have modified this code to do some neat video processing effects. All GPU-based filters in this application that display to the screen run at 60 FPS on my iPhone 4.
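To give a flavor of what such a shader-based filter looks like, here is an illustrative grayscale fragment shader embedded as an Objective-C string constant. The varying and uniform names (`textureCoordinate`, `inputImageTexture`) are an assumed convention, not necessarily what the sample application uses; the luminance weights are the standard Rec. 601-style coefficients.

```objectivec
// Illustrative OpenGL ES 2.0 fragment shader: converts each video frame
// to luminance-weighted grayscale entirely on the GPU. The attribute and
// uniform names here are assumptions for the sketch.
static NSString *const kGrayscaleFragmentShader = @"\
varying highp vec2 textureCoordinate;\n\
uniform sampler2D inputImageTexture;\n\
void main()\n\
{\n\
    lowp vec4 color = texture2D(inputImageTexture, textureCoordinate);\n\
    lowp float luminance = dot(color.rgb, vec3(0.2125, 0.7154, 0.0721));\n\
    gl_FragColor = vec4(vec3(luminance), color.a);\n\
}";
```

Because the shader runs per-pixel on the GPU, toggling a filter like this on or off is just a matter of swapping which program renders the incoming camera texture, with no CPU-side image copies.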

The only iOS device out there that supports video, yet doesn't have an OpenGL ES 2.0 capable GPU, is the iPhone 3G. If you need to target that device as well, you might be able to take the base code for video capture and generation of OpenGL ES textures, and then use the filter code from Apple's GLImageProcessing sample application. That application is built around OpenGL ES 1.1, support for which is present on all iOS devices.

However, I highly encourage looking at OpenGL ES 2.0 for this, because you can pull off many more kinds of effects using shaders than you can with the fixed-function OpenGL ES 1.1 pipeline.

(Edit: 2/13/2012) As an update on the above, I've now created an open source framework called GPUImage that encapsulates this kind of custom image filtering. It also handles capturing video and displaying it to the screen after being filtered, requiring as few as six lines of code to set all of this up. For more on the framework, you can read my more detailed announcement.
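A minimal sketch of that setup, based on GPUImage's documented camera-filter-view pattern (a grayscale filter is used here as the example effect):

```objectivec
// Live camera input -> grayscale filter -> on-screen view, using GPUImage.
GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc]
    initWithSessionPreset:AVCaptureSessionPreset640x480
           cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageGrayscaleFilter *grayscaleFilter = [[GPUImageGrayscaleFilter alloc] init];
[videoCamera addTarget:grayscaleFilter];

GPUImageView *filteredVideoView =
    [[GPUImageView alloc] initWithFrame:self.view.bounds];
[grayscaleFilter addTarget:filteredVideoView];
[self.view addSubview:filteredVideoView];

[videoCamera startCameraCapture];
```

Toggling the black-and-white effect on and off would then amount to removing the filter from the chain (or swapping in a pass-through filter) rather than touching the capture pipeline itself.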