Design issue related to my problem with NSSound -isPlaying

  • I wanted to thank everyone again for the assistance and advice delivered on thread "[NSSound isPlaying] fails to indicate sound termination in Lion".  I know what to do with that one now, and if Apple ever says anything about my bug report, I will let those of you who have helped most know what they said.

    There is a related design issue, however, for which I no longer have a good Cocoa solution; perhaps some of you could advise me.

    The issue is synching animations -- visual ones -- with sounds.  Think of it as lip-synching.  You will recall that what I wanted to do was have a cat-face appear, open its mouth, say (play sound) "meow", then close its mouth and go away.  For this little effect to look best, the open-mouth animation should end just as the sound begins, and the close-mouth animation should start just as the sound ends.  I know I can't count on that happening if the user has too many processes running and the underlying Unix scheduler decides to swap at just the wrong time, but let's assume that my app is getting plenty of cycles.

    There is no problem issuing [meow play] as the next instruction after the end of the open-mouth animation; the issue is doing the best job of making the close-mouth animation start just as the sound ends.  The problem with a call-back method, like the NSSound delegate method "sound:didFinishPlaying:", is that that method evidently works by posting an event of some sort to the main event loop, or perhaps to a dispatch queue somewhere, and if there are other events preceding it in either the event loop or the dispatch queue, the delivery of "sound:didFinishPlaying:" may be delayed.  (My app is a Lisp REPL; it occasionally does some big chunks of work; fractional-second delays do happen, and a few of those in the queue would noticeably mess up the timing of the animation.)
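    For concreteness, the delegate approach I am describing looks roughly like this; the class name, the property, and "closeMouth" are just illustrative names of mine, not anything from the frameworks:

        // Sketch of the NSSound delegate approach; CatFaceController,
        // meow, and closeMouth are illustrative names.
        @interface CatFaceController : NSObject <NSSoundDelegate>
        @property (strong) NSSound *meow;
        @end

        @implementation CatFaceController

        - (void)meowThenCloseMouth
        {
            self.meow = [NSSound soundNamed:@"meow"];  // assumes a "meow" sound resource
            self.meow.delegate = self;
            [self.meow play];   // returns immediately; playback is asynchronous
        }

        // NSSound delegate callback -- delivered via the main run loop, so it
        // can sit behind whatever else is queued there; that is the concern.
        - (void)sound:(NSSound *)sound didFinishPlaying:(BOOL)finished
        {
            [self closeMouth];  // hypothetical close-mouth animation method
        }

        - (void)closeMouth { /* run the close-mouth animation */ }

        @end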

    It was for that reason that I originally chose a polling loop

        while( [meow isPlaying] )
            ;

    to establish that the sound had ended.  The code to close the cat's mouth originally followed the end of the while loop.  I did and do know that polling loops are bad form for various reasons, but to get a cute and infrequently-used special effect looking just right, I thought that a half-second polling loop was a defensible coding choice.

    That polling loop doesn't work under Lion -- so be it.  And a call-back may be delayed for the reasons just discussed.

    The design issue is, what else can I do?

    To give you a feel for an approach that would be faster, but that has other disadvantages and might be difficult to implement in Cocoa, remember that back when dinosaurs ruled the earth and we all knew how to program 8259s, we could have just set up some kind of interrupt -- arranged that sound termination cause code to run that preempts the processor's resources (or at least, the main thread's), in order to run the close-mouth animation right NOW!  I do know how to use Unix's SIGINT facility, but I am not sure there is a way to get sound termination to generate an interrupt, and I would be a bit uneasy having an interrupt handler try to do something -- run the animation -- that normally is handled in the main event loop.

    So: In Cocoa, is there a reasonable approach to getting things to happen quickly, that works when call-back on the main event loop isn't fast enough?  It is probably not a matter of great consequence for my situation, but it might matter more to others, and it is an interesting design challenge in any case.

    --  Jay Reynolds Freeman
    ---------------------
    <Jay_Reynolds_Freeman...>
    http://JayReynoldsFreeman.com (personal web site)
  • On 23/07/2012, at 8:10 PM, Jay Reynolds Freeman wrote:

    > So: In Cocoa, is there a reasonable approach to getting things to happen quickly, that works when call-back on the main event loop isn't fast enough?  It is probably not a matter of great consequence for my situation, but it might matter more to others, and it is an interesting design challenge in any case.

    The main event loop is fast enough. The problem you have is that you're scheduling work on it that isn't fast enough. Perhaps you need to move that work to a thread, or break it into chunks that can be done over several runs of the loop.
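    With GCD, for instance, moving the heavy work off the main thread looks roughly like this (doBigChunkOfWork and updateUI are placeholder names for your own code):

        // Keep the main run loop free for event delivery; only the UI
        // update hops back to the main thread.
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self doBigChunkOfWork];          // heavy work off the main thread
            dispatch_async(dispatch_get_main_queue(), ^{
                [self updateUI];              // UI work back on the main thread
            });
        });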

    Synching sound and visual animation usually works by triggering events at the same time and arranging that they take about the same time. Core Animation includes callbacks that can trigger the next step in a series of animations -- including starting a sound -- and sound playback likewise provides events that can be used to trigger animations or further sounds.

    Used properly, Core Animation is pretty powerful, and it doesn't demand any special techniques to make it "fast enough". Its animations are done using threads anyway.
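    A sketch of that chaining, using a Core Animation completion callback (mouthLayer, openMouthAnimation, and meow are illustrative variables):

        // The completion block must be set before the animations are added
        // within the transaction; it fires when they have all finished.
        [CATransaction begin];
        [CATransaction setCompletionBlock:^{
            [meow play];   // open-mouth animation done; start the sound now
        }];
        [mouthLayer addAnimation:openMouthAnimation forKey:@"open"];
        [CATransaction commit];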

    --Graham
  • On Jul 23, 2012, at 3:10 AM, Jay Reynolds Freeman <jay_reynolds_freeman...> wrote:

    >
    > The issue is synching animations -- visual ones -- with sounds.  Think of it as lip-synching.  You will recall that what I wanted to do was have a cat-face appear, open its mouth, say (play sound) "meow", then close its mouth and go away.  For this little effect to look best, the open-mouth animation should end just as the sound begins, and the close-mouth animation should start just as the sound ends.  I know I can't count on that happening if the user has too many processes running and the underlying Unix scheduler decides to swap at just the wrong time, but let's assume that my app is getting plenty of cycles.
    >
    > There is no problem issuing [meow play] as the next instruction after the end of the open-mouth animation; the issue is doing the best job of making the close-mouth animation start just as the sound ends.  The problem with a call-back method, like the NSSound delegate method "sound:didFinishPlaying:", is that that method evidently works by posting an event of some sort to the main event loop, or perhaps to a dispatch queue somewhere, and if there are other events preceding it in either the event loop or the dispatch queue, the delivery of "sound:didFinishPlaying:" may be delayed.  (My app is a Lisp REPL; it occasionally does some big chunks of work; fractional-second delays do happen, and a few of those in the queue would noticeably mess up the timing of the animation.)

    NSSound is not the appropriate API for this. It makes no guarantees about its responsiveness. Look into AVFoundation.
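    A minimal AVAudioPlayer sketch, assuming a "meow.aiff" resource in the app bundle and illustrative class and method names:

        #import <AVFoundation/AVFoundation.h>

        @interface CatFace : NSObject <AVAudioPlayerDelegate>
        @property (strong) AVAudioPlayer *player;
        @end

        @implementation CatFace

        - (void)playMeow
        {
            NSURL *url = [[NSBundle mainBundle] URLForResource:@"meow"
                                                 withExtension:@"aiff"];
            self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url
                                                                 error:NULL];
            self.player.delegate = self;
            [self.player prepareToPlay];   // preload buffers so playback starts promptly
            [self.player play];
        }

        // Fires when the sound actually finishes playing.
        - (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)p successfully:(BOOL)flag
        {
            [self closeMouth];   // hypothetical close-mouth animation method
        }

        @end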

    --Kyle Sluder
  • On Jul 23, 2012, at 7:53 AM, Kyle Sluder <kyle...> wrote:

    > NSSound is not the appropriate API for this. It makes no guarantees about its responsiveness. Look into AVFoundation.

    Agreed. NSSound is basically a convenience for playing beep sounds as UI feedback, nothing more. It's not even guaranteed to _start_ the sound when you want: I've had trouble in the past with it blocking for up to a second while reading and decoding the audio file from disk.

    If you need synchronization, why not just make a movie with an embedded soundtrack and play that? Synchronization of audio to video is exactly what AV playback frameworks excel at — there is a lot of complexity to it, as you're discovering, and why should you have to re-invent the wheel?
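    A rough sketch of that with AVPlayer, assuming an illustrative "cat.mov" resource (for display you would attach the player to an AVPlayerLayer):

        #import <AVFoundation/AVFoundation.h>

        NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"cat"
                                                  withExtension:@"mov"];
        AVPlayer *player = [AVPlayer playerWithURL:movieURL];

        // Get notified on the main queue when the movie (and its audio) ends.
        [[NSNotificationCenter defaultCenter]
            addObserverForName:AVPlayerItemDidPlayToEndTimeNotification
                        object:player.currentItem
                         queue:[NSOperationQueue mainQueue]
                    usingBlock:^(NSNotification *note) {
                        // playback finished; clean up or go to the next step
                    }];
        [player play];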

    —Jens