Creating Gray Scale Image gives big leak...

  • Hi,

    I have an image viewing app that displays images by pushing
    them into an NSImageView.

    I want the user to view a gray scale version of the currently
    displayed image.

    When the user selects the "Show Gray Scale" menu item the
    action routine does

        NSImage* tmpImage = [self monochromeImage:[bigImage image] ];
        //NSLog(@" mkgs: %f,%f",tmpImage.size.width,tmpImage.size.height);
        [bigImage setImage:tmpImage ];
        [bigView setImage:tmpImage];

    (bigImage is the NSImageView.)

    The monochromeImage: method does:
    - (NSImage *) monochromeImage:(NSImage *)theImage
    {
        // convert the NSImage to a bitmap rep
        NSBitmapImageRep *bitmap;
        bitmap = (NSBitmapImageRep *)[theImage bestRepresentationForDevice:nil];

        // create a CIImage from the bitmap
        CIImage *ciImage = [[CIImage alloc] initWithBitmapImageRep:bitmap];

        // create the CIMaximumComponent filter
        CIFilter *transform = [CIFilter filterWithName:@"CIMaximumComponent"
                                         keysAndValues:@"inputImage", ciImage, nil];

        // get the new CIImage
        CIImage *result = [transform valueForKey:@"outputImage"];
        //NSLog(@"GS width= %f, height= %f",
        //      [result extent].size.width, [result extent].size.height);

        // convert back to an NSImage
        NSImage *image = [[[NSImage alloc]
            initWithSize:NSMakeSize(theImage.size.width, theImage.size.height)]
            autorelease];
        [image addRepresentation:[NSCIImageRep imageRepWithCIImage:result]];

        //NSLog(@"Reps: %@", [image representations]);

        [ciImage release];

        return image;
    }

    The above works fine, except that the rascal leaks whenever I create
    a grayscale image. I realize that no "bits" are carried back to the
    main program, i.e. the filter fires every time the image is redrawn.
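
    One way to force those bits back, assuming the app can require Mac OS
    X 10.5, would be NSBitmapImageRep's -initWithCIImage:, which renders
    the filter output into an ordinary bitmap once instead of re-running
    the filter on every redraw. A minimal, untested sketch, reusing the
    transform and theImage variables from the method above (flatRep and
    flatImage are just illustrative names):

        CIImage *result = [transform valueForKey:@"outputImage"];

        // render the filter output into ordinary pixels, once
        NSBitmapImageRep *flatRep =
            [[[NSBitmapImageRep alloc] initWithCIImage:result] autorelease];

        // wrap the flattened bitmap in a new NSImage
        NSImage *flatImage =
            [[[NSImage alloc] initWithSize:[theImage size]] autorelease];
        [flatImage addRepresentation:flatRep];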

    I tried cutting the following code into the above to try to get some
    bits back... (I found it on the web...)

    - (NSBitmapImageRep *)RGBABitmapImageRepWithCImage:(CIImage *)ciImage
    {
        int width    = [ciImage extent].size.width;
        int rows     = [ciImage extent].size.height;
        int rowBytes = (width * 4);

        NSBitmapImageRep *rep = [[NSBitmapImageRep alloc]
            initWithBitmapDataPlanes:nil
                          pixelsWide:width
                          pixelsHigh:rows
                       bitsPerSample:8
                     samplesPerPixel:4
                            hasAlpha:YES
                            isPlanar:NO
                      colorSpaceName:NSCalibratedRGBColorSpace
                        bitmapFormat:0
                         bytesPerRow:rowBytes
                        bitsPerPixel:0];

        CGColorSpaceRef colorSpace =
            CGColorSpaceCreateWithName( kCGColorSpaceGenericRGB );
        CGContextRef context =
            CGBitmapContextCreate( [rep bitmapData], width, rows, 8, rowBytes,
                                   colorSpace, kCGImageAlphaPremultipliedLast );

        CIContext *ciContext = [CIContext contextWithCGContext:context options:nil];
        [ciContext drawImage:ciImage atPoint:CGPointZero fromRect:[ciImage extent]];

        CGContextRelease( context );
        CGColorSpaceRelease( colorSpace );

        return [rep autorelease];
    }

    It worked, but the leak was worse!

    So the question is:

    Is there a way to pass an image to a proc that will transform the
    image via
    core image filters and get a new NSImage back without any leaks?

    Thanks

    Jerry
  • Hi,

    On 16.12.2007, at 21:03, Jerry LeVan wrote:

    > Is there a way to pass an image to a proc that will transform the
    > image via
    > core image filters and get a new NSImage back without any leaks?

    I do not know why your code leaks, but for me it looks a bit too
    complicated.

    For creating a gray imageRep I use the following code, which is a
    category of NSImageRep
    and works without any memory leak (and is fast):

    - (NSBitmapImageRep *) grayRepresentation
    {
        NSSize origSize = [self size];

        // create a new representation
        NSBitmapImageRep *newRep = [[NSBitmapImageRep alloc]
            initWithBitmapDataPlanes:NULL
                          pixelsWide:[self pixelsWide]
                          pixelsHigh:[self pixelsHigh]
                       bitsPerSample:8
                     samplesPerPixel:1
                            hasAlpha:NO  // not allowed !
                            isPlanar:NO
                      colorSpaceName:NSCalibratedWhiteColorSpace
                         bytesPerRow:0
                        bitsPerPixel:0];

        // this new imageRep has (as default) a resolution of 72 dpi
        [NSGraphicsContext saveGraphicsState];
        NSGraphicsContext *context =
            [NSGraphicsContext graphicsContextWithBitmapImageRep:newRep];
        [NSGraphicsContext setCurrentContext:context];
        [self drawInRect:NSMakeRect( 0, 0, [newRep pixelsWide],
                                           [newRep pixelsHigh] )];
        [NSGraphicsContext restoreGraphicsState];
        [newRep setSize:origSize];

        return [newRep autorelease];
    }
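
    For anyone who wants to drop this into a project: a minimal sketch of
    how the category might be declared and called from an NSImage. The
    category name, someImage, and the calling code below are illustrative
    placeholders, just for context:

        @interface NSImageRep (GrayRepresentation)
        - (NSBitmapImageRep *) grayRepresentation;
        @end

        // hypothetical call site: build a gray rep from the image's
        // first representation and wrap it in a new NSImage
        NSImageRep *srcRep = [[someImage representations] objectAtIndex:0];
        NSBitmapImageRep *grayRep = [srcRep grayRepresentation];
        NSImage *grayImage =
            [[[NSImage alloc] initWithSize:[grayRep size]] autorelease];
        [grayImage addRepresentation:grayRep];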

    Heinrich

    --
    Heinrich Giesen
    <giesenH...>
  • On Dec 18, 2007, at 12:20 PM, Heinrich Giesen wrote:

    > Hi,
    >
    > On 16.12.2007, at 21:03, Jerry LeVan wrote:
    >
    >> Is there a way to pass an image to a proc that will transform the
    >> image via
    >> core image filters and get a new NSImage back without any leaks?
    >
    >
    > I do not know why your code leaks, but for me it looks a bit too
    > complicated.
    >
    > For creating a gray imageRep I use the following code, which is a
    > category of NSImageRep
    > and works without any memory leak (and is fast):
    >
    > - (NSBitmapImageRep *) grayRepresentation
    > {
    >     NSSize origSize = [self size];
    >
    >     // create a new representation
    >     NSBitmapImageRep *newRep = [[NSBitmapImageRep alloc]
    >         initWithBitmapDataPlanes:NULL
    >                       pixelsWide:[self pixelsWide]
    >                       pixelsHigh:[self pixelsHigh]
    >                    bitsPerSample:8
    >                  samplesPerPixel:1
    >                         hasAlpha:NO  // not allowed !
    >                         isPlanar:NO
    >                   colorSpaceName:NSCalibratedWhiteColorSpace
    >                      bytesPerRow:0
    >                     bitsPerPixel:0];
    >
    >     // this new imageRep has (as default) a resolution of 72 dpi
    >     [NSGraphicsContext saveGraphicsState];
    >     NSGraphicsContext *context =
    >         [NSGraphicsContext graphicsContextWithBitmapImageRep:newRep];
    >     [NSGraphicsContext setCurrentContext:context];
    >     [self drawInRect:NSMakeRect( 0, 0, [newRep pixelsWide],
    >                                        [newRep pixelsHigh] )];
    >     [NSGraphicsContext restoreGraphicsState];
    >     [newRep setSize:origSize];
    >
    >     return [newRep autorelease];
    > }
    >
    > Heinrich
    >

    Heinrich, thank you for your timely response; my computer's memory
    also thanks you :)

    I am still trying to figure out, after a several-year layoff, what is
    going on in Cocoa.

    Part of my original quest was to learn how to apply a Core Image
    filter to an image and get a new image (with the filter applied)
    back without enormous leaks.

    Every attempt I have made at the Core Image solution has failed
    miserably with giant leaks; however, your technique is just what I
    need for my app (i.e. producing a grayscale image).

    I have recast your code into a method...

    *******************************************

    // Returns an autoreleased grayscale version of
    //  its parameter theImage

    - (NSImage *) monochromeImage:(NSImage *)theImage
    {
        // get a bitmap rep from the NSImage
        NSBitmapImageRep *abitmap;
        abitmap = (NSBitmapImageRep *)[theImage bestRepresentationForDevice:nil];
        int pw = [abitmap pixelsWide];
        int ph = [abitmap pixelsHigh];

        NSBitmapImageRep *bitmap = [[NSBitmapImageRep alloc]
            initWithBitmapDataPlanes:NULL
                          pixelsWide:pw
                          pixelsHigh:ph
                       bitsPerSample:8
                     samplesPerPixel:1
                            hasAlpha:NO
                            isPlanar:NO
                      colorSpaceName:NSCalibratedWhiteColorSpace
                         bytesPerRow:0
                        bitsPerPixel:0];

        [bitmap setSize:[theImage size]];

        NSImage *image = [[NSImage alloc] initWithSize:[theImage size]];

        NSGraphicsContext *nsContext =
            [NSGraphicsContext graphicsContextWithBitmapImageRep:bitmap];

        [NSGraphicsContext saveGraphicsState];
        [NSGraphicsContext setCurrentContext:nsContext];

        // Do I need a lockFocus here?
        [theImage drawAtPoint:NSMakePoint(0,0) fromRect:NSZeroRect
                    operation:NSCompositeCopy fraction:1.0];

        // Restore the previous graphics context and state.
        [NSGraphicsContext restoreGraphicsState];

        [image addRepresentation:bitmap];
        [bitmap release];
        return [image autorelease];
    }

    The only problem that I have is when I rotate an image, say 45 degrees.

    I do this by drawing the (rotated) image into a new image that is
    large enough
    to hold the rotated image.

    When the new image is displayed in my NSImageView it looks nice. But
    when I take the rotated image and apply the above method to it, the
    grayscale image draws correctly, except that the bits "outside" the
    rotated image in the enlarged image are rendered as black instead of
    white.
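
    A guess, not something settled in this thread: the new gray rep has
    no alpha, so pixels the rotated image never covers keep their initial
    value of 0, which is black in NSCalibratedWhiteColorSpace, and
    NSCompositeCopy copies the transparent border straight in as well.
    One possible fix is to paint the bitmap white first and then
    composite the source over it; a rough, untested sketch reusing the
    nsContext, pw, ph and theImage names from the method above:

        [NSGraphicsContext saveGraphicsState];
        [NSGraphicsContext setCurrentContext:nsContext];

        // fill the whole bitmap with white before drawing the image
        [[NSColor whiteColor] set];
        NSRectFill( NSMakeRect( 0, 0, pw, ph ) );

        // source-over keeps the white fill where the image is transparent
        [theImage drawAtPoint:NSMakePoint(0,0) fromRect:NSZeroRect
                    operation:NSCompositeSourceOver fraction:1.0];

        [NSGraphicsContext restoreGraphicsState];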

    Thanks,

    Jerry
  • Hello,

    On 19.12.2007, at 02:33, Jerry LeVan wrote:

    > I have recast your code into a method...
    >
    > *******************************************
    >
    > // Returns an autoreleased grayscale version of
    > //  its parameter theImage
    >
    > - (NSImage *) monochromeImage:(NSImage *)theImage
    > {
    >     // get a bitmap rep from the NSImage
    >     NSBitmapImageRep *abitmap;
    >     abitmap = (NSBitmapImageRep *)[theImage bestRepresentationForDevice:nil];
    >     int pw = [abitmap pixelsWide];
    >     int ph = [abitmap pixelsHigh];
    >
    >     NSBitmapImageRep *bitmap = [[NSBitmapImageRep alloc]
    >         initWithBitmapDataPlanes:NULL
    >                       pixelsWide:pw
    >                       pixelsHigh:ph
    >                    bitsPerSample:8
    >                  samplesPerPixel:1
    >                         hasAlpha:NO
    >                         isPlanar:NO
    >                   colorSpaceName:NSCalibratedWhiteColorSpace
    >                      bytesPerRow:0
    >                     bitsPerPixel:0];
    >
    >     [bitmap setSize:[theImage size]];
    >
    >     NSImage *image = [[NSImage alloc] initWithSize:[theImage size]];
    >
    >     NSGraphicsContext *nsContext =
    >         [NSGraphicsContext graphicsContextWithBitmapImageRep:bitmap];
    >
    >     [NSGraphicsContext saveGraphicsState];
    >     [NSGraphicsContext setCurrentContext:nsContext];
    >
    >     // Do I need a lockFocus here?
    >     [theImage drawAtPoint:NSMakePoint(0,0) fromRect:NSZeroRect
    >                 operation:NSCompositeCopy fraction:1.0];
    >
    >     // Restore the previous graphics context and state.
    >     [NSGraphicsContext restoreGraphicsState];
    >
    >     [image addRepresentation:bitmap];
    >     [bitmap release];
    >     return [image autorelease];
    > }

    No, you don't need a lockFocus. Setting a new NSGraphicsContext does
    the job.

    Let me guess: you didn't test this method with a high resolution
    (>>72 dpi) image? Setting the size of the new NSBitmapImageRep is
    done too early. You have to do it *after* the drawing/rendering. In
    my posted version I set the size after restoring the graphicsState,
    and there it works well. Drawing/rendering an image with a resolution
    other than 72 dpi always seems to be a bit dangerous.
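
    Applied to the monochromeImage: method above, that means dropping the
    early [bitmap setSize:...] call and setting the size only after the
    drawing is finished, roughly like this (same variable names as in the
    method above):

        [NSGraphicsContext saveGraphicsState];
        [NSGraphicsContext setCurrentContext:nsContext];
        [theImage drawAtPoint:NSMakePoint(0,0) fromRect:NSZeroRect
                    operation:NSCompositeCopy fraction:1.0];
        [NSGraphicsContext restoreGraphicsState];

        // only now give the (72 dpi) bitmap the original image's size
        [bitmap setSize:[theImage size]];

        [image addRepresentation:bitmap];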

    Heinrich

    --
    Heinrich Giesen
    <giesenH...>