In an iPhone app, I need to send a JPEG by mail with a maximum size of 300 KB (I don't know the maximum attachment size Mail.app allows, but that's another problem). To do that, I'm trying to decrease the quality until I obtain an image under 300 KB.
To find the quality value (compressionLevel) that gives me a JPEG under 300 KB, I wrote the following loop. It works, but each time the loop executes, memory usage grows by the original size of my JPEG (700 KB), despite the "[tmpImage release];".
float compressionLevel = 1.0f;
int size = 300001;
while (size > 300000) {
    UIImage *tmpImage = [[UIImage alloc] initWithContentsOfFile:
        [self fullDocumentsPathForTheFile:@"imageToAnalyse.jpg"]];
    size = [UIImageJPEGRepresentation(tmpImage, compressionLevel) length];
    [tmpImage release];
    // The 0.001f decrement below is commented out here
    // just to test the memory increase.
    //compressionLevel = compressionLevel - 0.001f;
    NSLog(@"Compression: %f", compressionLevel);
}
Any ideas about how I can fix this, or why it happens? Thanks.
At the very least, there's no point in allocating and releasing the image on every trip through the loop. It shouldn't leak memory, but it's unnecessary, so move the alloc/init and release out of the loop.
Also, the data returned by UIImageJPEGRepresentation is autoreleased, so it'll hang around until the current autorelease pool drains (when you get back to the main event loop). Consider adding:
NSAutoreleasePool* p = [[NSAutoreleasePool alloc] init];
at the top of the loop, and
[p drain];
at the end. That way you won't accumulate all of the intermediate data across iterations.
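Putting both suggestions together, the loop could look roughly like this (a sketch reusing your fullDocumentsPathForTheFile: helper; the 0.05f step is just a placeholder value, not a recommendation):

UIImage *tmpImage = [[UIImage alloc] initWithContentsOfFile:
    [self fullDocumentsPathForTheFile:@"imageToAnalyse.jpg"]];
float compressionLevel = 1.0f;
NSUInteger size = 300001;
while (size > 300000 && compressionLevel > 0.0f) {
    NSAutoreleasePool *p = [[NSAutoreleasePool alloc] init];
    // UIImageJPEGRepresentation returns an autoreleased NSData;
    // draining the pool frees it at the end of every iteration.
    size = [UIImageJPEGRepresentation(tmpImage, compressionLevel) length];
    compressionLevel -= 0.05f;
    [p drain];
}
[tmpImage release];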
And finally, doing a linear search for the optimal compression setting is probably pretty inefficient. Do a binary search instead.
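A rough sketch of that binary search (the function name jpegDataUnderLimit and the 8-iteration cut-off are my own choices, not anything from UIKit; it assumes higher quality always produces a larger file, which holds closely enough in practice):

// Returns the highest-quality JPEG data under maxBytes, or nil if even
// the lowest quality is still too large.
NSData *jpegDataUnderLimit(UIImage *image, NSUInteger maxBytes) {
    float lo = 0.0f, hi = 1.0f;
    NSData *best = nil;
    for (int i = 0; i < 8; i++) {  // 8 halvings narrow the quality to ~0.004
        NSAutoreleasePool *p = [[NSAutoreleasePool alloc] init];
        float mid = (lo + hi) / 2.0f;
        NSData *candidate = UIImageJPEGRepresentation(image, mid);
        if ([candidate length] <= maxBytes) {
            [best release];
            best = [candidate retain];  // keep it alive past the pool drain
            lo = mid;                   // fits; try a higher quality
        } else {
            hi = mid;                   // too big; lower the quality
        }
        [p drain];
    }
    return [best autorelease];
}

Eight probes get you within about half a percent of the best quality setting, versus hundreds of probes with a 0.001 linear step.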