I'm looking for a way to retrieve the individual frames of a video using the iOS API. I tried AVAssetImageGenerator, but it seems to only provide frames to the nearest second, which is too coarse for my usage.
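For reference, here is what I tried. If you can target iOS 5, AVAssetImageGenerator has tolerance properties that are supposed to make it frame-accurate (by default it may snap to a nearby keyframe, which would explain the one-second granularity). A minimal sketch, where `videoURL` is a placeholder for a local file URL:

```objectivec
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetImageGenerator *generator =
    [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];

// By default the generator may return the nearest keyframe;
// zero tolerances request the exact frame (iOS 5 and later only).
generator.requestedTimeToleranceBefore = kCMTimeZero;
generator.requestedTimeToleranceAfter  = kCMTimeZero;

CMTime time = CMTimeMake(125, 25); // e.g. frame at 5.0 s in a 25 fps clip
NSError *error = nil;
CMTime actualTime;
CGImageRef cgImage = [generator copyCGImageAtTime:time
                                       actualTime:&actualTime
                                            error:&error];
if (cgImage) {
    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    // ... use frame ...
}
```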
From what I understand of the documentation, a pipeline of AVAssetReader, AVAssetReaderOutput and CMSampleBufferGetImageBuffer should get me most of the way there, but I end up stuck with a CVImageBufferRef. From there I'm looking for a way to get a CGImageRef or a UIImage, but I haven't found it.
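This is the kind of conversion I'm after, sketched with the usual CGBitmapContext approach. It assumes the reader output was configured to deliver kCVPixelFormatType_32BGRA pixel buffers; the helper name is my own:

```objectivec
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Hypothetical helper: convert a BGRA CVImageBufferRef into a UIImage.
static UIImage *UIImageFromPixelBuffer(CVImageBufferRef pixelBuffer)
{
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    void *base         = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t width       = CVPixelBufferGetWidth(pixelBuffer);
    size_t height      = CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

    // BGRA little-endian with premultiplied alpha matches 32BGRA buffers.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(base, width, height, 8,
        bytesPerRow, colorSpace,
        kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    return image;
}
```

The matching AVAssetReaderTrackOutput would be created with outputSettings specifying kCVPixelFormatType_32BGRA for kCVPixelBufferPixelFormatTypeKey.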
Real-time performance is not needed, and the more I can stick to the provided API the better.
Thanks a lot!
Edit:
Based on this site: http://www.7twenty7.com/blog/2010/11/video-processing-with-av-foundation and this question: how to convert a CVImageBufferRef to UIImage, I'm nearing a solution. The problem is that the AVAssetReader stops reading after the first copyNextSampleBuffer without giving me anything (the sampleBuffer is NULL).
The video is readable by MPMoviePlayerController, so I don't understand what's wrong.
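In case it helps anyone diagnose the same thing, the reader's status is the only hint available when the buffer comes back NULL. A sketch of the read loop body, assuming `reader` is an already-started AVAssetReader and `output` its AVAssetReaderTrackOutput:

```objectivec
#import <AVFoundation/AVFoundation.h>

CMSampleBufferRef sampleBuffer = [output copyNextSampleBuffer];
if (sampleBuffer == NULL) {
    // copyNextSampleBuffer returns NULL either when all samples have been
    // delivered (status == AVAssetReaderStatusCompleted) or on failure.
    if (reader.status == AVAssetReaderStatusFailed) {
        NSLog(@"AVAssetReader failed: %@", reader.error);
    }
} else {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // ... process imageBuffer ...
    CFRelease(sampleBuffer); // copy* methods follow the Create rule
}
```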
The two links above actually answer my question, and the empty copyNextSampleBuffer
is an issue with iOS SDK 5.0b3; it works on the device.