I've thoroughly read through the new iOS 8 Photos framework documentation and I am trying to fetch some assets from the user's library to display. I let the user edit 4 images at once, and because of this I need to compress the images, otherwise the app will crash.
I am using a PHImageManager to load the images via the following code:
func processImages()
{
    println("Processing")
    _selectediImages = Array()
    _cacheImageComplete = 0
    for asset in _selectedAssets
    {
        var options:PHImageRequestOptions = PHImageRequestOptions()
        options.version = PHImageRequestOptionsVersion.Unadjusted
        options.synchronous = true

        var minRatio:CGFloat = 1
        if(CGFloat(asset.pixelWidth) > UIScreen.mainScreen().bounds.width || CGFloat(asset.pixelHeight) > UIScreen.mainScreen().bounds.height)
        {
            minRatio = min(UIScreen.mainScreen().bounds.width/CGFloat(asset.pixelWidth), UIScreen.mainScreen().bounds.height/CGFloat(asset.pixelHeight))
        }

        var size:CGSize = CGSizeMake(CGFloat(asset.pixelWidth)*minRatio, CGFloat(asset.pixelHeight)*minRatio)
        println("Target size is \(size)")

        PHImageManager.defaultManager().requestImageForAsset(asset, targetSize: size, contentMode: .AspectFill, options: options)
        {
            uiimageResult, info in
            var image = iImage(uiimage: uiimageResult)
            println("Result Size Is \(uiimageResult.size)")
        }
    }
}
As you can see, I am calculating the target size to make sure the image is no bigger than the screen; if it is, I scale it down proportionally. However, here is a typical print log:
Target size is (768.0,798.453531598513)
Result Size Is (1614.0,1678.0)
Even though I am setting the target size to 768x798 (in that specific case), the resulting UIImage it gives me is more than double that. According to the documentation, the targetSize parameter is:
"The target size of image to be returned."
Not the clearest explanation, but from my experiments the returned image is clearly NOT matching it.
If you have some suggestions I'd love to hear them!
In Swift, you want to do something like this:
var asset: PHAsset!
var imageSize = CGSize(width: 100, height: 100)

var options = PHImageRequestOptions()
options.resizeMode = PHImageRequestOptionsResizeMode.Exact
options.deliveryMode = PHImageRequestOptionsDeliveryMode.Opportunistic

PHImageManager.defaultManager().requestImageForAsset(asset, targetSize: imageSize, contentMode: PHImageContentMode.AspectFill, options: options) {
    (image, info) -> Void in
    // what you want to do with the image here
    println("Result Size Is \(image.size)")
}
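Applied to the processImages() loop from the question, the change would presumably just be one extra line on the options that are already being built there (a sketch reusing the question's own variable names):

    var options:PHImageRequestOptions = PHImageRequestOptions()
    options.version = PHImageRequestOptionsVersion.Unadjusted
    options.synchronous = true
    // Without an explicit resizeMode, the image manager may hand back a
    // conveniently available image that is larger than targetSize;
    // .Exact asks it to resize the result to match targetSize.
    options.resizeMode = PHImageRequestOptionsResizeMode.Exact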
In Objective-C, it looks something like this:
void (^resultHandler)(UIImage *, NSDictionary *) = ^(UIImage *result, NSDictionary *info) {
    // what you want to do with the image
};

CGSize cellSize = CGSizeMake(100, 100);

PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.resizeMode = PHImageRequestOptionsResizeModeExact;
options.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;

[[PHImageManager defaultManager] requestImageForAsset:self.imageAsset
                                           targetSize:cellSize
                                          contentMode:PHImageContentModeAspectFill
                                              options:options
                                        resultHandler:resultHandler];
Important Note: With the Opportunistic delivery mode, the result block may be called more than once with different sizes, but the last call will deliver the size you asked for. Opportunistic is usually preferable, because the UI can show a low-quality placeholder first and then update it once the OS has generated a better image (rather than showing a blank square).
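If you need to tell the low-quality placeholder apart from the final image inside the result handler, the info dictionary carries PHImageResultIsDegradedKey. A minimal sketch, in the same style as the Swift example above and reusing its asset, imageSize, and options:

    PHImageManager.defaultManager().requestImageForAsset(asset, targetSize: imageSize, contentMode: .AspectFill, options: options) {
        (image, info) -> Void in
        // true for the quick placeholder delivered first, false (or absent) for the final image
        let isDegraded = (info[PHImageResultIsDegradedKey] as? Bool) ?? false
        if isDegraded {
            // show the placeholder right away
        } else {
            // final image, sized according to targetSize and resizeMode
        }
    }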