Applying filters on a video file

Eyal · Aug 29, 2013 · Viewed 10.9k times

I want to apply filters (effects) on a video file while the video is playing.

I'm currently using @BradLarson's (great) GPUImage framework to do so; the problem is that the framework doesn't support audio playback while playing the video.

So I have two options:

1) Dive into the GPUImage code and change GPUImageMovie so it also processes the audio buffers. This requires knowing how to sync the audio and video frames, and unfortunately I don't. I've seen some hacks that try to play the audio with AVAudioPlayer, but they have a lot of sync problems.

2) Use the Core Image framework instead of GPUImage.

So I want to take a look at the second option: using the native iOS Core Image framework and CIFilter to do the job.

The problem is that I couldn't find any example of how to do this with CIFilter. How do I apply filters to a video from a file?

Do I have to use an AVAssetReader to read the video and process each frame? If so, I'm back to my first problem of syncing the audio and video.
Or is there a way to apply the filter chain directly to the video or to the preview layer?

Appreciate any help :)

Answer

Parvez Belim · Sep 7, 2013

Stick with the GPUImage framework you are already using; it is the best framework for video filters available right now. Go through the framework's documentation at https://github.com/BradLarson/GPUImage and scroll down the page for details of the available filters.

These filters are applied to the video, and to write the filtered video out you use the GPUImageMovieWriter class. It handles the audio automatically, so you don't have to manage it yourself: set the shouldPassthroughAudio property of GPUImageMovieWriter and it takes care of the audio on its own.

For more help, see this tutorial: http://www.sunsetlakesoftware.com/2012/02/12/introducing-gpuimage-framework

Here is code where I use the GPUImage framework to crop a video; the audio is preserved, not removed, after editing.

// Source video from the asset library
NSURL *videoUrl = [selectedAsset defaultRepresentation].url;
GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:videoUrl];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = YES;

self.cropFilter = [[GPUImageCropFilter alloc] initWithCropRegion:videoArea];
[movieFile addTarget:self.cropFilter];

// Set up a temporary output path in the Documents directory
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                        [NSString stringWithFormat:@"CroppedVideo-%u.mov", arc4random() % 1000]];
NSURL *movieURL = [NSURL fileURLWithPath:myPathDocs];

// Read the source track's size and orientation so the writer matches it
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoUrl options:nil];
AVAssetTrack *videoAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;

self.movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL
                                                            size:videoAssetTrack.naturalSize];
[self.cropFilter addTarget:self.movieWriter];

// Pass the source audio straight through to the output file
self.movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = self.movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:self.movieWriter];

// startRecordingInOrientation: starts the recording itself,
// so a separate startRecording call is not needed
[self.movieWriter startRecordingInOrientation:videoTransform];

[movieFile startProcessing];

__block BOOL completeRec = NO;
__weak typeof(self) weakSelf = self;
[self.movieWriter setCompletionBlock:^{
    [weakSelf.cropFilter removeTarget:weakSelf.movieWriter];
    [weakSelf.movieWriter finishRecording];
    [movieFile removeTarget:weakSelf.cropFilter];
    if (!completeRec)
    {
        [weakSelf videoCropDoneUrl:movieURL];
        completeRec = YES;
    }
}];
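
For reference, the question's second option (Core Image) later became practical too: on iOS 9 and newer, AVFoundation can apply a CIFilter chain per frame during playback while AVFoundation keeps audio and video in sync and plays the audio untouched. A minimal sketch, assuming a `videoURL` file URL and a simple sepia filter:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// videoURL is an assumed file URL to the source movie
AVAsset *asset = [AVAsset assetWithURL:videoURL];
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];

// The handler is called once per video frame during playback
AVVideoComposition *composition =
    [AVVideoComposition videoCompositionWithAsset:asset
                     applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *request) {
        [sepia setValue:request.sourceImage forKey:kCIInputImageKey];
        CIImage *output = [sepia.outputImage imageByCroppingToRect:request.sourceImage.extent];
        [request finishWithImage:output context:nil];
    }];

AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
item.videoComposition = composition;
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
// Attach an AVPlayerLayer to display the output, then call [player play];
```

Note this API postdates the original question (it shipped with iOS 9 in 2015), so at the time of writing, GPUImage was the pragmatic choice.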