I'm building an app that lets the user take a photo on the iPhone, or select one from the library, and upload it to a Parse backend.
The problem I'm facing concerns the size of the file.
I've read about what big players like Facebook, Twitter, Instagram, and Google do regarding resolution and file size, but I can't get close to their results.
I'm sure they have the best code and tools for this, but I'd be happy to get as close as possible using standard iOS APIs.
This is what I'm doing right now:
- (UIImage *)normalResImageForAsset:(ALAsset *)asset
{
    // Convert the ALAsset to a UIImage
    UIImage *image = [self highResImageForAsset:asset];

    // Determine the output size
    CGFloat maxSize = 1024.0f;
    CGFloat width = image.size.width;
    CGFloat height = image.size.height;
    CGFloat newWidth = width;
    CGFloat newHeight = height;

    // If either side exceeds the maximum size, reduce the greater side
    // to 1024 px and scale the other one proportionally
    if (width > maxSize || height > maxSize) {
        if (width > height) {
            newWidth = maxSize;
            newHeight = (height * maxSize) / width;
        } else {
            newHeight = maxSize;
            newWidth = (width * maxSize) / height;
        }
    }

    // Resize the image
    CGSize newSize = CGSizeMake(newWidth, newHeight);
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Apply maximum JPEG compression to decrease file size
    // and enable faster uploads & downloads
    NSData *imageData = UIImageJPEGRepresentation(newImage, 0.0f);
    UIImage *processedImage = [UIImage imageWithData:imageData];
    return processedImage;
}
I'm capping the image at 1024 px on its longer side (width or height) as a first restriction, and then applying maximum JPEG compression to reduce the file size.
This works and cuts approximately 50% of the file size without really damaging the JPEGs, but the results are still large, especially for photos taken with the phone's camera. The processed image can still easily be 1 MB, which is way too much.
I'm guessing I'm missing some useful step or using the wrong technique.
Any feedback would be greatly appreciated.
I had a similar problem and I also thought the compression wasn't working. It turned out that elsewhere in my code I was writing the file to disk using a different compression level. You might be doing the same thing with the data this function returns. A good way to check that the compression is actually effective is to do something like this:
NSData *imgData1 = UIImageJPEGRepresentation(newImage, 1.0f);
NSLog(@"1.0 size: %lu", (unsigned long)imgData1.length);
NSData *imgData2 = UIImageJPEGRepresentation(newImage, 0.7f);
NSLog(@"0.7 size: %lu", (unsigned long)imgData2.length);
NSData *imgData3 = UIImageJPEGRepresentation(newImage, 0.4f);
NSLog(@"0.4 size: %lu", (unsigned long)imgData3.length);
NSData *imgData4 = UIImageJPEGRepresentation(newImage, 0.0f);
NSLog(@"0.0 size: %lu", (unsigned long)imgData4.length);
// Don't convert the NSData back to a UIImage before writing to disk
[imgData4 writeToFile:imagePath atomically:YES];
With a 640x480 image I get file sizes ranging from 325 kB (for 1.0) down to 18 kB (for 0.0).
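Applied to the question's code, the practical consequence is: have the resizing method return the resized UIImage (dropping the final UIImageJPEGRepresentation / imageWithData: round trip), encode to NSData exactly once, and hand those bytes to Parse directly. Here's a minimal sketch of that idea, assuming Parse's PFFile fileWithName:data: API; the uploadAsset: wrapper, the resizedImageForAsset: helper, and the @"photo.jpg" file name are hypothetical names for illustration:

- (void)uploadAsset:(ALAsset *)asset
{
    // Hypothetical helper: the question's resizing method with the final
    // JPEG-encode/decode step removed, so it returns the resized UIImage.
    UIImage *resized = [self resizedImageForAsset:asset];

    // Encode exactly once; these are the bytes that should hit the network.
    NSData *imageData = UIImageJPEGRepresentation(resized, 0.0f);

    // Hand the compressed NSData straight to Parse. Converting it back to
    // a UIImage here would throw the compression away on the next encode.
    PFFile *file = [PFFile fileWithName:@"photo.jpg" data:imageData];
    [file saveInBackground];
}

The key design point is that UIImage is a decoded bitmap, so the compression setting only lives in the NSData; every UIImage-to-NSData conversion re-encodes from scratch with whatever quality that call specifies.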