I've been looking for an answer to this, and a few existing threads seem close to what I need, but I'm not sure. I found Question #9152851 and Question #2617625 and poked around a bunch of related links, but I could use some direction here.
Essentially, I'm dispatching an async call that processes an image with OpenCV. As the code below shows, I convert the result to an NSData * before sending it back to my delegate:
NSData *processedData = [NSData dataWithBytes:processedImage.data length:(processedImage.rows * processedImage.cols)];
[self.delegate onProcessedBitmapReady:processedData withFocusQuality:focusQuality];
But when the call comes back to my delegate, processedBitmap shows up in the debugger as type (OS_dispatch_data *) with a value of bytes. So when I try to create the UIImage from it, the image ends up nil.
- (void)onProcessedBitmapReady:(NSData *)processedBitmap withFocusQuality:(double)focusQuality
{
//Use converted image from self.captureCommand onComplete
UIImage *image = [[UIImage alloc] initWithData:processedBitmap];
[self saveImage:image];
}
Here is a screen capture of the values:
So, how do I convert those bytes (or whatever they are) into something that I can stuff into a UIImage?
Thank you in advance for your help.
------------------------- EDIT: Adding a new image -------------------------
Does this new image help?
Thank you.
NSData is actually a class cluster that only provides the interface; there are multiple special implementations of it floating around. It appears that OS_dispatch_data is one such special implementation, made to pass data objects around with blocks, especially since your UIImage creation doesn't crash (as it would if you passed it a non-NSData object, or just garbage memory). Instead, it looks like UIImage simply doesn't recognize the format the image is in!
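For example, one way to give UIImage a format it does recognize is to encode the cv::Mat into a PNG container before bridging it to NSData, instead of handing it the raw pixel buffer. A rough sketch (Objective-C++, assuming an 8-bit single- or three-channel cv::Mat; PNGDataFromMat is just an illustrative helper name):

#import <UIKit/UIKit.h>
#import <opencv2/opencv.hpp>

// Encode the raw pixels into a PNG container so that
// -[UIImage initWithData:] has a format it can actually decode.
static NSData *PNGDataFromMat(const cv::Mat &mat)
{
    std::vector<uchar> pngBuffer;
    cv::imencode(".png", mat, pngBuffer);
    return [NSData dataWithBytes:pngBuffer.data() length:pngBuffer.size()];
}

With that in place, your existing [[UIImage alloc] initWithData:processedBitmap] on the delegate side should work unchanged, because the bytes are now a real image format.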
By the way, Apple has a great guide about the concept of class clusters, which can be found here.
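A quick way to see the cluster in action (a sketch, assuming a recent SDK, iOS 7 / OS X 10.9 or later, where dispatch objects are Objective-C objects and dispatch data is bridged to NSData):

#import <Foundation/Foundation.h>
#import <dispatch/dispatch.h>

int main(void)
{
    @autoreleasepool {
        NSData *plain = [NSData dataWithBytes:"abc" length:3];
        dispatch_data_t wrapped =
            dispatch_data_create("abc", 3, NULL, DISPATCH_DATA_DESTRUCTOR_DEFAULT);

        // Both log private concrete subclasses (e.g. NSConcreteData and
        // OS_dispatch_data), yet both behave as NSData.
        NSLog(@"%@", [plain class]);
        NSLog(@"%@", [(NSData *)wrapped class]);
        NSLog(@"%d", [(NSData *)wrapped isKindOfClass:[NSData class]]); // prints 1
    }
    return 0;
}

So seeing OS_dispatch_data in the debugger doesn't mean the object isn't a valid NSData; it's just the concrete class behind the interface.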