How to get pixel information from CIImage of QT frame

  • Hi All,

    Thanks for your attention to this newbie in CoreImage programming!

    I think this must be an old topic, since I found many similar
    problems online when I ran into this trouble. However, there is no
    complete solution for it, and most importantly, none of them applies
    to my situation.

    I am now working on a video analysis program. We load a QT movie and
    display it in an OpenGLView using Core Video (based mainly on the
    civideoDemo sample code). Now we need to do pixel analysis on the
    frames of the movie. I am sure there is no problem in the
    video-to-frame path, since we have already applied some composite
    filters to the CIImage frames we get from the movie.

    The straightforward idea is to convert each frame into a Core Image
    object using

    [CIImage imageWithCVImageBuffer:currentFrame]

    Here currentFrame is defined by:

    CVImageBufferRef    currentFrame;

    and then our idea is to get the pixel information using
    NSBitmapImageRep.

    One attempt, using [[NSBitmapImageRep alloc] initWithCIImage:],
    failed when the program ran into an EXC_BAD_ACCESS error.

    I tried another way: use a CIContext to render the CIImage into
    bitmap data:

    NSData *pixelData = nil;
    [ciContextOffScreen render:inputCIImage toBitmap:pixelData
                      rowBytes:rowBytes bounds:[inputCIImage extent]
                        format:kCIFormatRGBAf colorSpace:nil];
    if (!pixelData)
        [ciContext render:inputCIImage toBitmap:pixelData
                 rowBytes:rowBytes bounds:[inputCIImage extent]
                   format:kCIFormatRGBAf colorSpace:nil];
    return [[NSBitmapImageRep alloc] initWithData:pixelData];

    But the problem is that in the end there is no data in pixelData,
    so the returned NSBitmapImageRep is empty.

    So I am stuck here. Could any expert give me some suggestions?

    Thanks for any of your help!

    JArod
  • On Jan 15, 2008, at 5:51 PM, JArod Wen wrote:

    > I am now working on a video analysis program. We load a QT movie
    > and display it in an OpenGLView using Core Video (based mainly on
    > the civideoDemo sample code). Now we need to do pixel analysis on
    > the frames of the movie. I am sure there is no problem in the
    > video-to-frame path, since we have already applied some composite
    > filters to the CIImage frames we get from the movie.
    >
    > The straightforward idea is to convert each frame into a Core
    > Image object using
    >
    > [CIImage imageWithCVImageBuffer:currentFrame]
    >
    > Here currentFrame is defined by:
    >
    > CVImageBufferRef    currentFrame;
    >
    > and then our idea is to get the pixel information using
    > NSBitmapImageRep.
    >
    > One attempt, using [[NSBitmapImageRep alloc] initWithCIImage:],
    > failed when the program ran into an EXC_BAD_ACCESS error.
    >
    > I tried another way: use a CIContext to render the CIImage into
    > bitmap data:
    >
    > NSData *pixelData = nil;
    > [ciContextOffScreen render:inputCIImage toBitmap:pixelData
    >                   rowBytes:rowBytes bounds:[inputCIImage extent]
    >                     format:kCIFormatRGBAf colorSpace:nil];
    > if (!pixelData)
    >     [ciContext render:inputCIImage toBitmap:pixelData
    >              rowBytes:rowBytes bounds:[inputCIImage extent]
    >                format:kCIFormatRGBAf colorSpace:nil];
    > return [[NSBitmapImageRep alloc] initWithData:pixelData];
    >
    > But the problem is that in the end there is no data in
    > pixelData, so the returned NSBitmapImageRep is empty.

    Searching on 'CVImageBufferRef NSImage' turned up a decent number of
    hits, including the following link from the QTKit Capture Guide:

    <http://developer.apple.com/documentation/QuickTime/Conceptual/QTKitCaptureProgrammingGuide/CreatingStopMotionApplication/chapter_5_section_7.html>

    In the addFrame: block of sample code, they build an image with the
    frame buffer data using an NSCIImageRep:

    NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:
        [CIImage imageWithCVImageBuffer:imageBuffer]];
    NSImage *image = [[[NSImage alloc] initWithSize:[imageRep size]]
        autorelease];
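
    In your situation that might look roughly like the following
    (an untested sketch, not the exact sample code: currentFrame is your
    CVImageBufferRef, and going through -TIFFRepresentation is just one
    way to end up with a bitmap-backed rep you can query):

    // Sketch: wrap the frame in an NSCIImageRep, composite it into an
    // NSImage, then pull a bitmap representation out of the TIFF data.
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:currentFrame];
    NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:ciImage];

    NSImage *image =
        [[[NSImage alloc] initWithSize:[imageRep size]] autorelease];
    [image addRepresentation:imageRep];

    // Forces a render; the resulting rep has real pixel storage.
    NSBitmapImageRep *bitmap =
        [NSBitmapImageRep imageRepWithData:[image TIFFRepresentation]];
    NSColor *sample = [bitmap colorAtX:0 y:0];   // e.g. sample one pixel

    Round-tripping through TIFF data isn't fast, but it avoids poking at
    the CVImageBufferRef directly.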

    Note that the Apple example also has other lines around those two
    calls (to deal with things like being called from a thread other than
    the main thread), so definitely read the entire example and see what
    may apply in your case.  No idea why you were getting the
    EXC_BAD_ACCESS; could it be that you were also calling things from
    other threads?  It could also be that it's not possible to create an
    NSBitmapImageRep from such a data provider; after all, a
    CVImageBufferRef is really an abstract type.  Since you mentioned
    OpenGL, your CVImageBufferRef is probably really a CVOpenGLBufferRef.
    You may also want to move this thread to the QuickTime developers
    list.
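
    If you do want to stay with -render:toBitmap:rowBytes:bounds:format:
    colorSpace:, note that the toBitmap: argument is a raw buffer you
    allocate yourself; passing a nil NSData* gives Core Image nowhere to
    write, which is probably why pixelData stays empty. A minimal
    untested sketch, reusing your inputCIImage and ciContextOffScreen and
    assuming the kCIFormatRGBAf layout of four 32-bit floats per pixel:

    CGRect extent   = [inputCIImage extent];
    size_t width    = (size_t)extent.size.width;
    size_t height   = (size_t)extent.size.height;
    size_t rowBytes = width * 4 * sizeof(float);  // RGBAf = 16 bytes/pixel

    // Allocate the destination buffer ourselves; Core Image fills it in.
    float *bitmapData = (float *)calloc(height, rowBytes);
    [ciContextOffScreen render:inputCIImage
                      toBitmap:bitmapData
                      rowBytes:rowBytes
                        bounds:extent
                        format:kCIFormatRGBAf
                    colorSpace:nil];

    // The pixels are now plain floats: R, G, B, A for each pixel in turn.
    float red   = bitmapData[0];
    float green = bitmapData[1];
    float blue  = bitmapData[2];
    float alpha = bitmapData[3];
    // ... analyze, then free(bitmapData) when done.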
    ___________________________________________________________
    Ricky A. Sharp        mailto:<rsharp...>
    Instant Interactive(tm)  http://www.instantinteractive.com