Leak when animating Core Animation Superlayer

  • Hi,

    I'm currently writing an app that creates a complex hierarchy of CALayers.
    Every time I animate one of the layers, I get a memory leak.
    In order to investigate the issue, I created a simple Cocoa project in which
    I only added a CustomView and the following code for this view:

    @implementation CustomView

    - (void) awakeFromNib
    {
        CALayer *mainLayer = [[CALayer layer] retain];
        [self setLayer:mainLayer];
        [self setWantsLayer:YES];

        // create a dummy sublayer of mainLayer
        l1 = [CALayer layer];  // l1 is declared as a CALayer * in the header file
        l1.frame = CGRectMake(0.0, 0.0, 80.0, 30.0);
        [mainLayer addSublayer:l1];

        // create a dummy sublayer of l1
        l2 = [CATextLayer layer];  // l2 is declared as a CATextLayer * in the header file
        l2.frame = CGRectMake(10.0, 10.0, 60.0, 20.0);
        l2.string = @"test";
        [l1 addSublayer:l2];

        // trigger a repeating layer animation
        [NSTimer scheduledTimerWithTimeInterval:0.4
                                         target:self
                                       selector:@selector(fromTimer:)
                                       userInfo:nil
                                        repeats:YES];
    }

    - (void) fromTimer: (NSTimer *) t
    {
        if (l1.opacity == 1.0)
            l1.opacity = 0.2;
        else
            l1.opacity = 1.0;
    }

    @end

    When I run the program in MallocDebug, the memory usage goes up every time
    the timer function is executed.
    If I change the animation to be on l2 instead of l1, or if I create l2 as a
    sublayer of mainLayer, the memory usage remains constant.
    Am I doing something wrong?

    Thanks
    Stephane
  • On May 31, 2008, at 16:39, "Stéphane Droux" <sdroux...> wrote:

    > When I run the program in MallocDebug, the memory usage goes up
    > every time the timer function is executed. If I change the animation
    > to be on l2 instead of l1, or if I create l2 as a sublayer of
    > mainLayer, the memory usage remains constant. Am I doing something
    > wrong?

    An NSTimer is not really the appropriate way to do this. Have a look
    at CABasicAnimation, which you might use in the following manner
    (untested code written on an iPhone, so no guarantees):

    CABasicAnimation *anim = [CABasicAnimation animationWithKeyPath:@"opacity"];

    [anim setFromValue:1.0];
    [anim setToValue:0.2];
    [anim setAutoreverses:YES];
    [anim setDuration:1.0];

    [l2 addAnimation:anim forKey:nil];

    I would recommend reviewing the Core Animation documentation for
    additional examples.
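
    One caveat with the snippet above: the from/to values of a key-path
    animation are objects, so strictly the numbers should be boxed. A
    corrected sketch (again untested), using repeatCount so that no timer
    is needed at all:

    CABasicAnimation *anim = [CABasicAnimation animationWithKeyPath:@"opacity"];

    anim.fromValue = [NSNumber numberWithDouble:1.0];
    anim.toValue = [NSNumber numberWithDouble:0.2];
    anim.autoreverses = YES;
    anim.duration = 1.0;
    anim.repeatCount = HUGE_VALF;  // repeat until the animation is removed

    [l2 addAnimation:anim forKey:@"pulse"];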

    /brian
  • On Sat, May 31, 2008 at 11:14 PM, Brian Christensen <brian...>
    wrote:

    > On May 31, 2008, at 16:39, "Stéphane Droux" <sdroux...>
    >
    >> When I run the program in MallocDebug, the memory usage goes up every
    >> time the timer function is executed. If I change the animation to be on l2
    >> instead of l1, or if I create l2 as a sublayer of mainLayer, the memory
    >> usage remains constant. Am I doing something wrong?
    >
    >
    > An NSTimer is not really the appropriate way to do this. Have a look at
    > CABasicAnimation, which you might use in the following manner (untested code
    > written on an iPhone, so no guarantees):
    >
    > CABasicAnimation *anim = [CABasicAnimation animationWithKeyPath:@"opacity"];
    >
    > [anim setFromValue:1.0];
    > [anim setToValue:0.2];
    > [anim setAutoreverses:YES];
    > [anim setDuration:1.0];
    >
    > [l2 addAnimation:anim forKey:nil];
    >
    > I would recommend reviewing the Core Animation documentation for additional
    > examples.
    >
    > /brian
    >

    Actually, the animation in my example is only there to reproduce the leak. I
    could have had it pick a random value for the opacity (or any other
    property), in which case an autoreversing animation wouldn't work, but the
    program would still leak.

    The problem I have is that when I run several animations on a layer that is
    not a leaf, I get a leak.
    I tried implicit animations and explicit animations, and they both leak.
    The timer in my example is only there to simulate several animations.

    When I run the program in MallocDebug, it seems that one of the threads used
    by Core Animation to animate the layers doesn't release its memory.
    So it looks like a bug in Core Animation. However, since animating non-leaf
    layers is such a core feature of Core Animation, I guess there's something
    missing in my code.

    Thanks
    Stephane
  • On Jun 1, 2008, at 2:55, Stéphane Droux wrote:

    > When I run the program in Mallocdebug, it seems that one of the
    > threads used
    > by Core Animation to animate the layers doesn't release its memory.
    > So it looks like a bug in Core Animation. However, since animating
    > non-leaf
    > layers is such a core feature of Core Animation, I guess there's
    > something
    > missing in my code

    Are you sure you're getting a leak? I ran your sample code in
    Instruments with the Object Alloc and Leaks tools and I didn't detect
    any leaks. Object allocation and memory usage remained constant.
    Whether l1 or l2 was being animated made no discernible difference.

    /brian
  • On Sun, Jun 1, 2008 at 8:49 AM, Brian Christensen <brian...> wrote:

    > On Jun 1, 2008, at 2:55 , Stéphane Droux wrote:
    >
    > When I run the program in Mallocdebug, it seems that one of the threads
    >> used
    >> by Core Animation to animate the layers doesn't release its memory.
    >> So it looks like a bug in Core Animation. However, since animating
    >> non-leaf
    >> layers is such a core feature of Core Animation, I guess there's something
    >> missing in my code
    >>
    >
    > Are you sure you're getting a leak? I ran your sample code in Instruments
    > with the Object Alloc and Leaks tools and I didn't detect any leaks. Object
    > allocation and memory usage remained constant. Whether l1 or l2 were being
    > animated made no discernible difference.
    >
    > /brian
    >
    >

    Brian,

    You're right: when l1 is animated, the memory usage increases for a while and
    then reaches a peak, after which it remains constant.

    I think my example wasn't complex enough to reproduce my application's
    behaviour.
    I've changed the timer function to do random animations, and this time it
    really "leaks":

    - (void) fromTimer: (NSTimer *) t
    {
        l2.frame = CGRectMake((double)random() / RAND_MAX * 30,
                              (double)random() / RAND_MAX * 30,
                              (double)random() / RAND_MAX * 40,
                              (double)random() / RAND_MAX * 40);
        l1.opacity = (double)random() / RAND_MAX;
    }

    I ran it with ObjectAlloc and can see a trend of increasing memory usage:
    the usage keeps moving up and down, but the general trend is up, and it
    doubled after a minute or so.

    Could that be caused by some kind of caching in Core Animation?
    If it is, is there a way to flush the cache?

    Thanks
    Stephane
  • Well, I'm a novice at Cocoa/Obj-C programming, so I might be wrong. But
    I see that you're setting a new frame on l2 every time fromTimer: is
    called, and you never release the old one. The following may solve it,
    if that's the problem.

    - (void) fromTimer: (NSTimer *) t
    {
        [l2.frame release];
        l2.frame = CGRectMake((double)random() / RAND_MAX*30, (double)random() /
            RAND_MAX*30, (double)random() / RAND_MAX*40, (double)random() /
            RAND_MAX*40);
        l1.opacity = (double)random() / RAND_MAX;
    }

    Or are you using the garbage collector?
    By the way, shouldn't this be done via an accessor method? [l2 setFrame:...]

    --
    Gustavo Eulalio
    <guga.emc...>

    On Sun, Jun 1, 2008 at 6:03 AM, Stéphane Droux <sdroux...> wrote:
    > On Sun, Jun 1, 2008 at 8:49 AM, Brian Christensen <brian...> wrote:
    >
    >> On Jun 1, 2008, at 2:55 , Stéphane Droux wrote:
    >>
    >> When I run the program in Mallocdebug, it seems that one of the threads
    >>> used
    >>> by Core Animation to animate the layers doesn't release its memory.
    >>> So it looks like a bug in Core Animation. However, since animating
    >>> non-leaf
    >>> layers is such a core feature of Core Animation, I guess there's something
    >>> missing in my code
    >>>
    >>
    >> Are you sure you're getting a leak? I ran your sample code in Instruments
    >> with the Object Alloc and Leaks tools and I didn't detect any leaks. Object
    >> allocation and memory usage remained constant. Whether l1 or l2 were being
    >> animated made no discernible difference.
    >>
    >> /brian
    >>
    >>
    >
    > Brian,
    >
    > You're right, when l1 is animated the memory usage increases for a while and
    > then reaches a peak from which it remains constant.
    >
    > I think my example wasn't complex enough to reproduce my application
    > behaviour.
    > I've changed the timer function to do random animations and this time it
    > really "leaks":
    >
    > - (void) fromTimer: (NSTimer *) t
    > {
    > l2.frame = CGRectMake((double)random() / RAND_MAX*30, (double)random() /
    > RAND_MAX*30, (double)random() / RAND_MAX*40, (double)random() /
    > RAND_MAX*40);
    > l1.opacity = (double)random() / RAND_MAX;
    > }
    >
    >
    > I ran it in Object alloc and can see a trend of increasing memory usage :
    > the memory usage keeps changing up and down but the general trend is up and
    > it doubled after a minute or so.
    >
    > Could that be caused by some kind of caching in  Core Animation ?
    > If it is, is there a way to flush the cache ?
    >
    > Thanks
    > Stephane
  • On Jun 1, 2008, at 5:03 , Stéphane Droux wrote:

    > I've changed the timer function to do random animations and this
    > time it
    > really "leaks":
    >
    > - (void) fromTimer: (NSTimer *) t
    > {
    > l2.frame = CGRectMake((double)random() / RAND_MAX*30,
    > (double)random() /
    > RAND_MAX*30, (double)random() / RAND_MAX*40, (double)random() /
    > RAND_MAX*40);
    > l1.opacity = (double)random() / RAND_MAX;
    > }
    >
    >
    > I ran it in Object alloc and can see a trend of increasing memory
    > usage :
    > the memory usage keeps changing up and down but the general trend is
    > up and
    > it doubled after a minute or so.
    >
    > Could that be caused by some kind of caching in  Core Animation ?
    > If it is, is there a way to flush the cache ?

    Even with this new code I'm still not observing any leaking. Are you
    using garbage collection? With GC enabled you will observe
    fluctuations until the collector gets a chance to free up unused
    memory, but even then after a few minutes or so the usage level should
    periodically return to a reduced level.

    I am not privy to the caching Core Animation is doing internally, but
    also keep in mind that it maintains both presentation and model layers
    behind the scenes, in addition to whatever internal caching might be
    happening to improve performance.

    What kind of hardware are you running? I suspect we may be seeing some
    differences in our results based on that (perhaps different graphics
    hardware is causing Core Animation to have to do more - or different -
    work on your machine). I still don't see why this would be leaking on
    your machine though.

    /brian
  • On Sun, Jun 1, 2008 at 1:15 PM, Gustavo Eulalio <guga.emc...> wrote:

    > Well, I'm novice to Cocoa/ObjC programming, so, I might be wrong. But
    > I see you're setting a new frame on l2 every time fromTimer: is
    > called, but you never release the old one. The following may solve it,
    > if that's the problem.
    >
    > - (void) fromTimer: (NSTimer *) t
    > {
    > [l2.frame release]
    > l2.frame = CGRectMake((double)random() / RAND_MAX*30, (double)random()
    > /
    > RAND_MAX*30, (double)random() / RAND_MAX*40, (double)random() /
    > RAND_MAX*40);
    > l1.opacity = (double)random() / RAND_MAX;
    > }
    >
    > Or are you using the garbage collector?
    > By the way, shouldn't this be done via an accessor method? [l2
    > setFrame:...]
    >
    >
    frame is a read/write CGRect property. CGRect is a plain C struct, not an
    object, so assigning via l2.frame copies the value and doesn't need any
    retain/release.
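
    To make the distinction concrete, a minimal sketch:

    CALayer *layer = [CALayer layer];

    // dot syntax and the accessor message compile to the same setter call
    layer.frame = CGRectMake(0.0, 0.0, 80.0, 30.0);
    [layer setFrame:CGRectMake(0.0, 0.0, 80.0, 30.0)];  // equivalent

    // CGRect is copied by value; the old frame is simply overwritten
    CGRect r = layer.frame;  // r is an independent copy
    r.origin.x = 50.0;       // does not affect layer.frame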
  • On Sun, Jun 1, 2008 at 1:40 PM, Brian Christensen <brian...> wrote:

    >
    > Even with this new code I'm still not observing any leaking. Are you using
    > garbage collection? With GC enabled you will observe fluctuations until the
    > collector gets a chance to free up unused memory, but even then after a few
    > minutes or so the usage level should periodically return to a reduced level.
    >
    > I am not privy to the caching Core Animation is doing internally, but also
    > keep in mind that it maintains both presentation and model layers behind the
    > scenes, in addition to whatever internal caching might be happening to
    > improve performance.
    >
    > What kind of hardware are you running? I suspect we may be seeing some
    > differences in our results based on that (perhaps different graphics
    > hardware is causing Core Animation to have to do more - or different - work
    > on your machine). I still don't see why this would be leaking on your
    > machine though.
    >
    > /brian
    >
    >
    I was running it without garbage collection. I recompiled it with garbage
    collection and get exactly the same problem.
    I let it run for 5 minutes and the memory usage keeps increasing. Even if
    there are fluctuations, the general trend is up and it never goes down.

    Regarding the internal mechanisms of Core Animation, the items in
    MallocDebug seem to show that the memory is eaten by the render tree,
    which is a bit worrying since that's a private part of Core Animation that
    I don't think I can do anything about.

    I'm running it on a MacBook (first model). The integrated graphics chip
    might be a reason for Core Animation to use more cache/memory, but I don't
    see why this memory would keep increasing.
  • On 6/1/08, Stéphane Droux <sdroux...> wrote:
    > On Sun, Jun 1, 2008 at 1:40 PM, Brian Christensen <brian...> wrote:
    >
    >>
    >> Even with this new code I'm still not observing any leaking. Are you using
    >> garbage collection? With GC enabled you will observe fluctuations until
    >> the
    >> collector gets a chance to free up unused memory, but even then after a
    >> few
    >> minutes or so the usage level should periodically return to a reduced
    >> level.
    >>
    >> I am not privy to the caching Core Animation is doing internally, but also
    >> keep in mind that it maintains both presentation and model layers behind
    >> the
    >> scenes, in addition to whatever internal caching might be happening to
    >> improve performance.
    >>
    >> What kind of hardware are you running? I suspect we may be seeing some
    >> differences in our results based on that (perhaps different graphics
    >> hardware is causing Core Animation to have to do more - or different -
    >> work
    >> on your machine). I still don't see why this would be leaking on your
    >> machine though.
    >>
    >> /brian
    >>
    >>
    > I was running it without garbage collection. I recompiled it with Garbage
    > collection and get the exact same problem.
    > I let it run for 5 minutes and the memory usage keeps increasing. Even if
    > there are fluctuations, the general trend is up and it never does down.
    >
    > Regarding the internal mechanisms of Core Animation, the items in
    > MallocDebug seem to show that the memory is eaten by the Rendering tree,
    > which is a bit worrying since it's a private part of Core Animation on which
    > I don't think I can do anything.
    >
    > I run it on a Macbook (first model). The integrated GC might a reason for
    > Core Animation to use more cache/memory, but I don't see why this memory
    > would keep increasing.

    If you kill the timer after it's been running for a while, does the
    memory usage drop back down?  Maybe the implicit animations are never
    completing before a new one gets added, so they're just stacking up on
    top of each other.
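
    One way to test that hypothesis (a sketch, untested): wrap the property
    changes in an explicit CATransaction with actions disabled, so that no
    implicit animations are created at all, and see whether the growth stops.

    - (void) fromTimer: (NSTimer *) t
    {
        [CATransaction begin];
        [CATransaction setDisableActions:YES];  // suppress implicit animations
        l2.frame = CGRectMake((double)random() / RAND_MAX * 30,
                              (double)random() / RAND_MAX * 30,
                              (double)random() / RAND_MAX * 40,
                              (double)random() / RAND_MAX * 40);
        l1.opacity = (double)random() / RAND_MAX;
        [CATransaction commit];
    }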
  • On Sun, Jun 1, 2008 at 3:18 PM, Jonathan del Strother <
    <maillist...> wrote:

    >
    >
    > If you kill the timer after its been running for a while, does the
    > memory usage drop back down?  Maybe the implicit animations are never
    > completing before a new one gets added, so they're just stacking up on
    > top of each other.
    >

    I don't think they should stack up. Core Animation's default animation
    duration is 0.25 seconds, and these animations are triggered every 0.4
    seconds.
    Anyway, I added a second timer to invalidate the first one after 60 seconds.
    The memory usage merely became constant, at about 60% more than when the
    application started. No memory was released.
  • On Jun 1, 2008, at 10:28 , Stéphane Droux wrote:

    >> If you kill the timer after its been running for a while, does the
    >> memory usage drop back down?  Maybe the implicit animations are never
    >> completing before a new one gets added, so they're just stacking up
    >> on
    >> top of each other.
    >>
    >
    > I don't think they should stack up. Core animation default duration
    > is 0.25
    > second and these animations are triggered every 0.4 secs.
    > Anyway, I added a timer to invalidate the first one after 60
    > seconds. The
    > memory usage only became constant. It was about 60% more than when the
    > application started. No memory was released.

    I would consider that to be expected behavior. If you aren't ever
    releasing the layers you created, why would any of the relevant memory
    be freed? The timer and the animations it is causing to be performed
    should not really be incurring a very significant memory footprint in
    addition to what the layers on their own are already using (my own
    observations at least indicate that running the test app with or
    without the timer makes very little difference in that regard).

    Are the two methods you posted really the only two methods in your
    entire test app? Or are you doing something else somewhere in addition
    to this? Feel free to e-mail me the test project off-list if you like.

    /brian
  • On Sun, Jun 1, 2008 at 6:31 PM, Brian Christensen <brian...> wrote:

    >
    > I would consider that to be expected behavior. If you aren't ever releasing
    > the layers you created, why would any of the relevant memory be freed? The
    > timer and the animations it is causing to be performed should not really be
    > incurring a very significant memory footprint in addition to what the layers
    > on their own are already using (my own observations at least indicate that
    > running the test app with or without the timer makes very little difference
    > in that regard).
    >
    > Are the two methods you posted really the only two methods in your entire
    > test app? Or are you doing something else somewhere in addition to this?
    > Feel free to e-mail me the test project off-list if you like.
    >
    > /brian
    >
    Yes, these two methods are my entire test app. I will email you the project
    off-list.

    I don't think it has to do with layer release.
    If I run it without the animation (the timer creation commented out), the
    memory usage in ObjectAlloc is constant (as expected) at 1.8MB.
    When I add the timer, the memory usage starts at 1.8MB and goes up to 3MB
    after 1 minute, 5MB after 2 minutes, and so on.
    The only difference between the two cases is "[NSTimer
    scheduledTimerWithTimeInterval:0.4 target:self selector:@selector(fromTimer:)
    userInfo:nil repeats:YES];" being commented out.

    Thanks
    Stephane
  • Hi,

    if you think you've found a memory leak inside Core Animation,
    please file a radar (bugreporter.apple.com) with a project showing the
    leak, and we'll look into it. Thanks,

    John

    On Jun 1, 2008, at 10:58 AM, Stéphane Droux wrote:

    > On Sun, Jun 1, 2008 at 6:31 PM, Brian Christensen
    > <brian...> wrote:
    >
    >>
    >> I would consider that to be expected behavior. If you aren't ever
    >> releasing
    >> the layers you created, why would any of the relevant memory be
    >> freed? The
    >> timer and the animations it is causing to be performed should not
    >> really be
    >> incurring a very significant memory footprint in addition to what
    >> the layers
    >> on their own are already using (my own observations at least
    >> indicate that
    >> running the test app with or without the timer makes very little
    >> difference
    >> in that regard).
    >>
    >> Are the two methods you posted really the only two methods in your
    >> entire
    >> test app? Or are you doing something else somewhere in addition to
    >> this?
    >> Feel free to e-mail me the test project off-list if you like.
    >>
    >> /brian
    >>
    >> Yes, these 2 methods are my entire test app. I will email you the
    >> project
    > off-list.
    >
    > I don't think it has to do with layer release.
    > If I run it without the animation (commented out the timer
    > creation), the
    > memory usage in Object Alloc is constant (as expected) at 1.8MB
    > When I add the timer, the memory usage starts at 1.8MB and goes up
    > to 3MB
    > after 1 minute, 5MB after 2 minutes, and so on.
    > The only difference between these 2 cases is "[NSTimer
    > scheduledTimerWithTimeInterval:0.4 target:self
    > selector:@selector(fromTimer:)
    > userInfo:nil repeats:YES];" being commented out.
    >
    > Thanks
    > Stephane