
iPhone - Decode images in background thread?

I have a background thread that loads images and displays them in the main thread. I noticed that the background thread has almost nothing to do, because the actual image decoding seems to be done in the main thread:

[profiler screenshot: the image-decoding work shows up on the main thread]

So far I’ve tried calling [UIImage imageNamed:], [UIImage imageWithData:] and CGImageCreateWithJPEGDataProvider in the background thread with no difference. Is there a way to force the decoding to be done on the background thread?

There’s already a similar question here, but it does not help. As I wrote there, I tried the following trick:

@implementation UIImage (Loading)

- (void) forceLoad
{
    const CGImageRef cgImage = [self CGImage];  

    const int width = CGImageGetWidth(cgImage);
    const int height = CGImageGetHeight(cgImage);

    const CGColorSpaceRef colorspace = CGImageGetColorSpace(cgImage);
    const CGContextRef context = CGBitmapContextCreate(
        NULL, /* Where to store the data. NULL = don’t care */
        width, height, /* width & height */
        8, width * 4, /* bits per component, bytes per row */
        colorspace, kCGImageAlphaNoneSkipFirst);

    NSParameterAssert(context);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
    CGContextRelease(context);
}

@end

That works (forces the image to decode), but it also triggers an apparently expensive call to ImageIO_BGR_A_TO_RGB_A_8Bit.
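
One variation that is sometimes used to sidestep that conversion is to decode into a bitmap context whose pixel format already matches what iOS composites natively (premultiplied BGRA, host byte order). The sketch below is an assumption on my part rather than anything confirmed in this thread: the category name, the switch to a device RGB color space, and the kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little flags are all illustrative.

@implementation UIImage (ForceDecodeBGRA)

// Hypothetical variant of -forceLoad: decode into a premultiplied-BGRA,
// host-byte-order context so Core Graphics should not need an extra
// channel-order conversion when the image is later composited.
- (void) forceLoadBGRA
{
    CGImageRef cgImage = [self CGImage];
    const size_t width  = CGImageGetWidth(cgImage);
    const size_t height = CGImageGetHeight(cgImage);

    /* Device RGB instead of the image's own color space (assumption). */
    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(
        NULL,                      /* let CG allocate the backing store */
        width, height,             /* width & height */
        8, width * 4,              /* bits per component, bytes per row */
        colorspace,
        kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
    CGColorSpaceRelease(colorspace);
    if (context == NULL) return;

    /* Drawing is what actually forces the compressed data to be decoded,
       and it happens on whatever thread calls this method. */
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
    CGContextRelease(context);
}

@end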



1 Answer


I ran into similar issues with hi-res images on the new retina iPad. Images larger than the screen size (roughly) would cause major problems with UI responsiveness. These were JPGs, so getting them to decode in the background seemed to be the right thing to do. I'm still working on tightening all of this up, but Tommy's solution worked great for me.

I just wanted to contribute some code to help the next person along when they're trying to identify why their UI is stuttering with large images. Here's what I ended up doing (this code runs in an NSOperation on a background queue). The example is a blend of my code and the code above:

  // self.data holds the raw JPEG bytes (under ARC this cast needs __bridge)
  CGDataProviderRef dataProvider = CGDataProviderCreateWithCFData((CFDataRef)self.data);
  CGImageRef newImage = CGImageCreateWithJPEGDataProvider(dataProvider,
                                    NULL, NO, 
                                    kCGRenderingIntentDefault);


  //////////
  // force DECODE

  const int width = CGImageGetWidth(newImage);
  const int height = CGImageGetHeight(newImage);

  const CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
  const CGContextRef context = CGBitmapContextCreate(
                                                     NULL, /* Where to store the data. NULL = don’t care */
                                                     width, height, /* width & height */
                                                     8, width * 4, /* bits per component, bytes per row */
                                                     colorspace, kCGImageAlphaNoneSkipFirst);

  NSParameterAssert(context);
  CGContextDrawImage(context, CGRectMake(0, 0, width, height), newImage);
  // Grab the already-decoded bitmap back out of the context as a new CGImage
  CGImageRef drawnImage = CGBitmapContextCreateImage(context);
  CGContextRelease(context);
  CGColorSpaceRelease(colorspace);

  //////////

  self.downloadedImage = [UIImage imageWithCGImage:drawnImage];

  CGDataProviderRelease(dataProvider);
  CGImageRelease(newImage);
  CGImageRelease(drawnImage);

I'm still optimizing this, but it seems to do pretty well so far.
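
If it helps anyone wiring this up, here is a rough GCD sketch of the same idea (I used an NSOperation; this is just an alternative). The decodedImageFromData: helper is a hypothetical stand-in for the force-decode code above, and the imageView property is assumed for illustration:

- (void) loadImageInBackground:(NSData *)jpegData
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        /* Run the expensive decompression off the main thread.
           decodedImageFromData: stands in for the decode code above. */
        UIImage *decoded = [self decodedImageFromData:jpegData];

        dispatch_async(dispatch_get_main_queue(), ^{
            /* UIKit work stays on the main thread; by now the bitmap
               is already decoded, so this assignment is cheap. */
            self.imageView.image = decoded;
        });
    });
}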

