How to create multi-resolution NSImage programmatically?

  • Hi,

    I need to create NSImages programmatically in memory, and I want them
    to display correctly on Retina displays.

    Namely, I have two images, btnImage.png and <btnImage...>, in the
    resources of my app bundle. At run time, I load that image with
    [NSImage imageNamed:@"btnImage"], split it into three NSImage
    slices, and draw them using NSDrawThreePartImage, according to the
    current resolution (1x or 2x).
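
    For context, the three-part drawing roughly looks like this (a
    simplified sketch; ButtonBackgroundView and the slice properties
    are hypothetical names, only NSDrawThreePartImage is AppKit API):

    ```objc
    #import <Cocoa/Cocoa.h>

    // Hypothetical view that draws a stretchable button background
    // from three pre-sliced images (left cap / center / right cap).
    @interface ButtonBackgroundView : NSView
    @property (nonatomic, strong) NSImage *leftCap, *center, *rightCap;
    @end

    @implementation ButtonBackgroundView
    - (void)drawRect:(NSRect)dirtyRect
    {
        // When drawing through the normal view context, AppKit picks
        // the best rep (1x or 2x) of each slice for the backing scale.
        NSDrawThreePartImage(self.bounds,
                             self.leftCap, self.center, self.rightCap,
                             NO,                    // horizontal layout
                             NSCompositeSourceOver,
                             1.0,                   // alpha
                             self.isFlipped);
    }
    @end
    ```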

    I tried to imitate the way NSImage seems to handle multi-res images: I
    created an NSImage containing two NSBitmapImageReps, corresponding to
    the two image reps of btnImage, and drew into each of them by making
    its context current with
    -[NSGraphicsContext graphicsContextWithBitmapImageRep:] and then
    calling -[NSImageRep
    drawInRect:fromRect:operation:fraction:respectFlipped:hints:] on the
    corresponding source rep.

    Is this a valid approach, or is there another, recommended way?
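
    Concretely, what I do looks roughly like this (a simplified sketch;
    MakeMultiResImage and drawHandler are placeholder names of mine):

    ```objc
    #import <Cocoa/Cocoa.h>

    // Sketch: build a multi-resolution NSImage in memory. The key
    // detail is that both reps report the same size in *points*, while
    // the 2x rep has twice the pixel dimensions.
    static NSImage *MakeMultiResImage(NSSize pointSize,
                                      void (^drawHandler)(NSRect))
    {
        NSImage *image = [[NSImage alloc] initWithSize:pointSize];
        for (CGFloat scale = 1.0; scale <= 2.0; scale += 1.0) {
            NSBitmapImageRep *rep = [[NSBitmapImageRep alloc]
                initWithBitmapDataPlanes:NULL
                              pixelsWide:(NSInteger)(pointSize.width * scale)
                              pixelsHigh:(NSInteger)(pointSize.height * scale)
                           bitsPerSample:8
                         samplesPerPixel:4
                                hasAlpha:YES
                                isPlanar:NO
                          colorSpaceName:NSCalibratedRGBColorSpace
                             bytesPerRow:0
                            bitsPerPixel:0];
            rep.size = pointSize; // point size, NOT pixel size

            [NSGraphicsContext saveGraphicsState];
            [NSGraphicsContext setCurrentContext:
                [NSGraphicsContext graphicsContextWithBitmapImageRep:rep]];
            // The rep's context is in pixel units, so scale user space
            // and draw in point coordinates.
            NSAffineTransform *t = [NSAffineTransform transform];
            [t scaleBy:scale];
            [t concat];
            drawHandler(NSMakeRect(0, 0, pointSize.width, pointSize.height));
            [NSGraphicsContext restoreGraphicsState];

            [image addRepresentation:rep];
        }
        return image;
    }
    ```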

    When I tested it, it doesn't always work correctly. In ordinary
    NSViews everything is fine. In layer-hosting views, when the image is
    drawn in a CALayer via its delegate, the correct rep (1x or 2x) is
    chosen, but the 2x image is scaled down (sic!) so that it appears at
    half the size it should. When I draw the original btnImage in the
    same CALayer, it is selected and scaled properly at both 1x and 2x. I
    have no idea what the problem is.

    BTW, the only modification I made in my layer-hosting view to support
    Retina was adding this method to the delegate implementation:

    - (BOOL)layer:(CALayer *)layer
            shouldInheritContentsScale:(CGFloat)newScale
                            fromWindow:(NSWindow *)window
    {
        return YES;
    }
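
    The rest of the layer-hosting setup is the usual pattern, roughly
    like this (view and delegate names are hypothetical):

    ```objc
    #import <Cocoa/Cocoa.h>
    #import <QuartzCore/QuartzCore.h>

    // Hypothetical setup for a layer-hosting view whose root layer is
    // drawn by "delegate" (an object implementing drawLayer:inContext:
    // and layer:shouldInheritContentsScale:fromWindow:).
    static void SetUpLayerHosting(NSView *view, id delegate)
    {
        CALayer *rootLayer = [CALayer layer];
        rootLayer.delegate = delegate;
        // Layer-hosting: set the layer before turning on wantsLayer.
        [view setLayer:rootLayer];
        [view setWantsLayer:YES];
        // Seed the scale from the current window; the delegate method
        // keeps it in sync when the window changes displays.
        rootLayer.contentsScale = view.window.backingScaleFactor;
        [rootLayer setNeedsDisplay];
    }
    ```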
