I play a video with AVPlayer. It works OK. Now I want to get a UIImage from the video while it is playing (when I push a button, for the moment). Attached to my AVPlayer there is a CALayer that is used to display the video in my UIView.
My idea is to grab a UIImage from that CALayer while the video is playing. I tried the code from another question:
UIImage from CALayer - iPhone SDK
However, my UIImage is empty. The resolution is right, but the image is fully white! It seems the video doesn't write into the contents of my CALayer.
Can someone help me? Thanks
I couldn't get Meet's solution to work for me, but it got me thinking in the right direction.
Below is the code that I ended up using in my project. The method screenshotFromPlayer:maximumSize: accepts an instance of an AVPlayer from which to take a screenshot, and a CGSize that will be the returned image's maximum size.
- (UIImage *)screenshotFromPlayer:(AVPlayer *)player maximumSize:(CGSize)maxSize
{
    CMTime actualTime;
    NSError *error;
    AVAssetImageGenerator *generator =
        [[AVAssetImageGenerator alloc] initWithAsset:player.currentItem.asset];

    // Setting a maximum size is not necessary for this code to
    // successfully get a screenshot, but it was useful for my project.
    generator.maximumSize = maxSize;

    CGImageRef cgIm = [generator copyCGImageAtTime:player.currentTime
                                        actualTime:&actualTime
                                             error:&error];
    // Check for failure before touching cgIm: on error the returned
    // CGImageRef is NULL, and releasing NULL would crash.
    if (cgIm == NULL) {
        NSLog(@"Error making screenshot: %@", [error localizedDescription]);
        NSLog(@"Actual screenshot time: %f Requested screenshot time: %f",
              CMTimeGetSeconds(actualTime),
              CMTimeGetSeconds(player.currentTime));
        return nil;
    }

    UIImage *image = [UIImage imageWithCGImage:cgIm];
    CGImageRelease(cgIm);
    return image;
}
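For example, the method above might be called from a button action like this (a sketch: `self.player` and `self.thumbnailView` are hypothetical property names standing in for your own player and image view):

- (IBAction)screenshotButtonTapped:(id)sender
{
    // Cap the screenshot at 640x480; pass CGSizeZero for full resolution.
    UIImage *shot = [self screenshotFromPlayer:self.player
                                   maximumSize:CGSizeMake(640.0, 480.0)];
    if (shot != nil) {
        self.thumbnailView.image = shot;
    }
}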
Note also that one could use the method generateCGImagesAsynchronouslyForTimes:completionHandler: (on the instance of AVAssetImageGenerator) instead of copyCGImageAtTime:actualTime:error: to perform the image generation asynchronously.
This code sample generates a screenshot at the AVPlayer's currentTime, but any time could be used instead.