[iPhone] Sample code for live camera stream?

  • I am having problems getting "live" images from the iPhone's camera.
    I suppose this should be quite simple, but I am not sure how to do it.

    Basically, what I want to do is this:

      loop forever:
        grab current image from the camera
        do some computer vision processing (object detection)
        display results on the iPhone's screen (possibly overlaid on the camera image)

    But I have no idea how to go about the first step in the loop, i.e., grabbing the current ("live") image from the camera.
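
    In code, I imagine something like this (purely hypothetical; grabCurrentCameraImage() and the two helpers are exactly the parts I don't know how to write):

      while (appIsRunning)
      {
          CGImageRef frame = grabCurrentCameraImage(); // <-- the part I'm missing
          detectObjects(frame);                        // my own computer vision code
          drawResultsOverlay();                        // draw results over the camera image
          CGImageRelease(frame);
      }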

    I did some googling, to no avail.
    The documentation of UIImagePickerController is not clear to me.
    And I couldn't find any sample code.

    So, could somebody please point me to more documentation, or to some sample code?

    Thanks a lot in advance.

    Best regards,
    Gabriel.
  • There is currently no API for this. The API allows you to place overlays on the screen, but video data is not delivered to your app until the user is finished recording.
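
    The overlay support looks roughly like this (untested sketch; yourOverlayView and someViewController stand in for your own objects):

      UIImagePickerController *picker = [[UIImagePickerController alloc] init];
      picker.sourceType = UIImagePickerControllerSourceTypeCamera;
      picker.showsCameraControls = NO;            // hide the default shutter UI
      picker.cameraOverlayView = yourOverlayView; // your view, drawn over the live preview
      [someViewController presentModalViewController:picker animated:YES];

    You can draw whatever you like on top of the preview, but the frames themselves never reach your code.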

    Luke

    On Dec 15, 2009, at 9:24 AM, Gabriel Zachmann wrote:

    > I am having problems getting "live" images from the iPhone's camera.
    > I suppose this should be quite simple, but I am not sure how to do it.
    >
    > Basically, what I want to do is this
    >
    > loop forever:
    > grab current image from the camera
    > do some computer vision processing (object detection)
    > display results on the iPhone's screen (possibly overlaid on the camera image)
    >
    > But I have no idea how to go about the first step in the loop, i.e., grabbing the current ("live") image from the camera.
    >
    > I did some googling, to no avail.
    > The documentation of UIImagePickerController is not clear to me.
    > And I couldn't find any sample code.
    >
    > So, could somebody please point me to more documentation? or to some sample code?
    >
    > Thanks a lot in advance.
    >
    > Best regards,
    > Gabriel.
  • > There is currently no API for this. The API allows you to place overlays on the screen, but video data is not delivered to your app until the user is finished recording.

    So, how do all the so-called augmented reality apps do it?

    Best regards,
    Gabriel.
  • They place UI elements over the video feed provided on screen by the built-in video capture window, but they don't have access to the actual video data.

    Mark

    On 15 Dec 2009, at 16:04, Gabriel Zachmann wrote:

    >> There is currently no API for this. The API allows you to place overlays on the screen, but video data is not delivered to your app until the user is finished recording.
    >
    > So, how do all the so-called augmented reality apps do it?
    >
    > Best regards,
    > Gabriel.
  • On Tue, Dec 15, 2009 at 12:54 PM, Luke the Hiesterman
    <luketheh...> wrote:
    > There is currently no API for this. The API allows you to place overlays on the screen, but video data is not delivered to your app until the user is finished recording.

    Erm, Ustream Live Broadcaster does it just fine. Are you suggesting
    they are circumventing API restrictions?

    http://www.techcrunch.com/2009/12/09/iphone-live-streaming-ustream/

    --Kyle Sluder
  • > Erm, Ustream Live Broadcaster does it just fine. Are you suggesting
    > they are circumventing API restrictions?
    >
    > http://www.techcrunch.com/2009/12/09/iphone-live-streaming-ustream/

    There was recently a clarification in the developer forums from Michael Jurewitz; I’d suggest you take a look there.

    -Ben
  • On Tue, Dec 15, 2009 at 2:24 PM, Benjamin Stiglitz <stig...> wrote:
    > There was recently a clarification in the developer forums from Michael Jurewitz; I’d suggest you take a look there.

    That would probably be the link that made the rounds on Twitter
    today, and which I couldn't view because I can't get to the iPhone dev
    forums.  :)  (Apparently access is not granted to team members, only
    to those with individual developer registrations. I can get into the
    Mac forums, though.)

    --Kyle Sluder
  • Scratch that, I have access.  It's just that the link I was provided
    did not work on my phone.  For those who also have access to the
    iPhone forums, here is the relevant link:

    https://devforums.apple.com/message/149553

    Not reproducing the contents here, obviously.

    --Kyle Sluder

    On Tue, Dec 15, 2009 at 2:31 PM, Kyle Sluder <kyle.sluder...> wrote:
    > That would probably be that link which made its rounds on Twitter
    > today, and which I couldn't view because I can't get to the iPhone dev
    > forums.  :)  (Apparently access is not granted for team members, only
    > to those with individual developer registrations. I can get into the
    > Mac forums, though.)
  • > They place UI elements over the video feed provided on screen by the built-in video capture window, but they don't have access to the actual video data.

    So, how do they do this: http://www.youtube.com/watch?v=QoZRHLmUKtM ?
    Or this: http://www.youtube.com/watch?v=5M-oAmBDcZk ?

    > Erm, Ustream Live Broadcaster does it just fine. Are you suggesting
    > they are circumventing API restrictions?
    >
    > http://www.techcrunch.com/2009/12/09/iphone-live-streaming-ustream/

    According to this
      http://www.tuaw.com/2009/12/14/app-store-approved-app-brings-video-recording-to-iphone-3g-and-1/

    they use the non-documented UIGetScreenImage().

    I'd love to use that if it's the only way -- if only somebody could provide me with some simple sample code showing how to use it ...

    Regards,
    Gabriel.
  • On Dec 15, 2009, at 3:11 PM, Gabriel Zachmann wrote:

    > According to this
    > http://www.tuaw.com/2009/12/14/app-store-approved-app-brings-video-recording-to-iphone-3g-and-1/

    > they use the non-documented UIGetScreenImage().
    >
    > I'd love to use that if it's the only way -- if only somebody could provide me with some simple sample code showing how to use it .

    while (youWantImages)
    {
        CGImageRef image = UIGetScreenImage();
        // process image here
        CGImageRelease(image); // even though this is a Get function, it returns a retained image
    }
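
    If you need the raw pixels for vision work, one way (untested sketch, plain CoreGraphics) is to draw the returned CGImage into a bitmap context you own; this would replace the "process image here" line:

      size_t width  = CGImageGetWidth(image);
      size_t height = CGImageGetHeight(image);
      void *pixels  = calloc(width * height, 4); // RGBA, 8 bits per channel
      CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
      CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8,
                                               width * 4, colorSpace,
                                               kCGImageAlphaPremultipliedLast);
      CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), image);
      // pixels now holds the frame as RGBA bytes; run your detection on it
      CGContextRelease(ctx);
      CGColorSpaceRelease(colorSpace);
      free(pixels);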

    Luke
  • Ah, that's probably why the message in the forums says that it will be superseded in the future (for proper naming and hopefully slightly more functionality).

    Thanks Apple!

    On Dec 15, 2009, at 5:14 PM, Luke the Hiesterman wrote:

    >
    > On Dec 15, 2009, at 3:11 PM, Gabriel Zachmann wrote:
    >
    >> According to this
    >> http://www.tuaw.com/2009/12/14/app-store-approved-app-brings-video-recording-to-iphone-3g-and-1/

    >> they use the non-documented UIGetScreenImage().
    >>
    >> I'd love to use that if it's the only way -- if only somebody could provide me with some simple sample code showing how to use it .
    >
    > while (youWantImages)
    > {
    > CGImageRef image = UIGetScreenImage();
    > //process image here
    > CGImageRelease(image); //even though this is a Get function, it returns a retained image.
    > }
    >
    > Luke

    Alex Kac - President and Founder
    Web Information Solutions, Inc.

    "If at first you don't succeed, skydiving is not for you."
    -- Francis Roberts
  • > Ah, that's probably why the message in the forums says that it will be superseded in the future (for proper naming and hopefully slightly more functionality).

    So there is at least some hope that this function won't go away ... that's good news.

    Could you please point me to the forum(s) where this was discussed?

    Regards,
    Gabriel.
  • Actually, the message in the forums specifically said that when the new function that supersedes it is released, you will be required to move to it. So that tells me it is going away.

    As for where in the forums: go to the Quartz2D sub-forum and look for a subject containing “Notice”, I think.

    On Dec 16, 2009, at 5:38 AM, Gabriel Zachmann wrote:

    >
    >> Ah, that's probably why the message in the forums says that it will be superseded in the future (for proper naming and hopefully slightly more functionality).
    >
    > So there is at least some hope that this function won't go away ... that's good news.
    >
    > Could you please point me to the forum(s) where this was discussed?
    >
    > Regards,
    > Gabriel.

    Alex Kac - President and Founder
    Web Information Solutions, Inc.

    "Patience is the companion of wisdom."
    --Anonymous
  • There is only one place where iPhone-specific dev is *supposed* to be discussed on an Apple site; Cocoa-Dev is for general Cocoa or Mac Cocoa development. I use both, depending on whether the topic is iPhone-specific or general Cocoa. Here is the forum:

    https://devforums.apple.com/community/iphone

    The specific post is here:
    https://devforums.apple.com/thread/34907?tstart=0

    I don't know if I'm allowed to repost the contents so I'm just going to leave a link to the post.

    On Dec 16, 2009, at 1:06 PM, Gabriel Zachmann wrote:

    > Thanks for your response.
    > I am, however, confused as to the "Quartz2D sub-forum" ... are you talking about the mailing list Quartzcomposer-dev on lists.apple.com? But why do you call it a "forum"?
    > At any rate, I couldn't find a post with subject 'Notice' there ...
    >
    > Regards,
    > Gabriel.
    >
    >
    >> Actually, the message in the forums specifically said that when the new function that supersedes it is released, you will be required to move to it. So that tells me it is going away.
    >>
    >> As for where in the forums: go to the Quartz2D sub-forum and look for a subject containing “Notice”, I think.
    >>
    >> On Dec 16, 2009, at 5:38 AM, Gabriel Zachmann wrote:
    >>
    >>>
    >>>> Ah, that's probably why the message in the forums says that it will be superseded in the future (for proper naming and hopefully slightly more functionality).
    >>>
    >>> So there is at least some hope that this function won't go away ... that's good news.
    >>>
    >>> Could you please point me to the forum(s) where this was discussed?
    >>>
    >>> Regards,
    >>> Gabriel.
    >>
    >> Alex Kac - President and Founder
    >> Web Information Solutions, Inc.
    >>
    >> "Patience is the companion of wisdom."
    >> --Anonymous

    Alex Kac - President and Founder
    Web Information Solutions, Inc.

    "I am not young enough to know everything."
    --Oscar Wilde
  • Any ideas on how Live Effects (http://www.omaxmedia.com/) manages to
    spatially transform the live camera input?  It doesn't just overlay like the
    augmented reality apps.

    Thanks,  - Conal

    On Tue, Dec 15, 2009 at 1:23 PM, Mark Woollard <mark.woollard...> wrote:

    > They place UI elements over the video feed provided on screen by the built
    > in video capture window, but they don't have access to the actual video
    > data.
    >
    > Mark
    >
    > On 15 Dec 2009, at 16:04, Gabriel Zachmann wrote:
    >
    >>> There is currently no API for this. The API allows you to place
    > overlays on the screen, but video data is not delivered to your app until
    > the user is finished recording.
    >>
    >> So, how do all the so-called augmented reality apps do it?
    >>
    >> Best regards,
    >> Gabriel.
    >
    >
  • You can get the video data now using UIGetScreenImage, though it's not
    the lightest on the battery.
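
    If you do go that route, it's friendlier to drive it from a timer than a tight loop, so you control the frame rate (untested sketch):

      // In setup: sample a few frames per second instead of spinning.
      [NSTimer scheduledTimerWithTimeInterval:0.2
                                       target:self
                                     selector:@selector(grabFrame:)
                                     userInfo:nil
                                      repeats:YES];

      // Timer callback:
      - (void)grabFrame:(NSTimer *)timer
      {
          CGImageRef image = UIGetScreenImage();
          // ... process or encode the frame ...
          CGImageRelease(image);
      }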

    --Kyle Sluder

    On Thu, Feb 4, 2010 at 5:53 PM, Conal Elliott <conal...> wrote:
    > Any ideas on how Live Effects (http://www.omaxmedia.com/) manages to
    > spatially transform the live camera input?  It doesn't just overlay like the
    > augmented reality apps.
    >
    > Thanks,  - Conal
    >
    > On Tue, Dec 15, 2009 at 1:23 PM, Mark Woollard <mark.woollard...>wrote:
    >
    >> They place UI elements over the video feed provided on screen by the built
    >> in video capture window, but they don't have access to the actual video
    >> data.
    >>
    >> Mark
    >>
    >> On 15 Dec 2009, at 16:04, Gabriel Zachmann wrote:
    >>
    >>>> There is currently no API for this. The API allows you to place
    >> overlays on the screen, but video data is not delivered to your app until
    >> the user is finished recording.
    >>>
    >>> So, how do all the so-called augmented reality apps do it?
    >>>
    >>> Best regards,
    >>> Gabriel.
    >>
    >>

    >
  • I haven't seen the app, but the simplest way to transform the camera input is via the cameraViewTransform property on UIImagePickerController, available in 3.1.
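
    For example (untested sketch; someViewController stands in for your own controller, and note this is affine only, i.e. scale, rotate, translate):

      UIImagePickerController *picker = [[UIImagePickerController alloc] init];
      picker.sourceType = UIImagePickerControllerSourceTypeCamera;
      picker.showsCameraControls = NO;
      // Rotate the live preview slightly and blow it up 1.5x.
      picker.cameraViewTransform = CGAffineTransformScale(
          CGAffineTransformMakeRotation(M_PI / 8.0), 1.5, 1.5);
      [someViewController presentModalViewController:picker animated:YES];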

    Luke

    On Feb 4, 2010, at 8:48 PM, Kyle Sluder wrote:

    > You can get the video data now using UIGetScreenImage, though it's not
    > the lightest on the battery.
    >
    > --Kyle Sluder
    >
    > On Thu, Feb 4, 2010 at 5:53 PM, Conal Elliott <conal...> wrote:
    >> Any ideas on how Live Effects (http://www.omaxmedia.com/) manages to
    >> spatially transform the live camera input?  It doesn't just overlay like the
    >> augmented reality apps.
    >>
    >> Thanks,  - Conal
    >>
    >> On Tue, Dec 15, 2009 at 1:23 PM, Mark Woollard <mark.woollard...>wrote:
    >>
    >>> They place UI elements over the video feed provided on screen by the built
    >>> in video capture window, but they don't have access to the actual video
    >>> data.
    >>>
    >>> Mark
    >>>
    >>> On 15 Dec 2009, at 16:04, Gabriel Zachmann wrote:
    >>>
    >>>>> There is currently no API for this. The API allows you to place
    >>> overlays on the screen, but video data is not delivered to your app until
    >>> the user is finished recording.
    >>>>
    >>>> So, how do all the so-called augmented reality apps do it?
    >>>>
    >>>> Best regards,
    >>>> Gabriel.
    >>>
    >>>

    >>

  • Hi Kyle,

    Doesn't UIGetScreenImage get the screen image, rather than the camera's
    input stream?  If the camera input were being displayed directly, without
    warping, I could see using UIGetScreenImage to get access.  In the Live
    Effects app, I only ever see the post-warp image, never the pre-warp input.

    Still puzzled,

      - Conal

    On Thu, Feb 4, 2010 at 8:48 PM, Kyle Sluder <kyle.sluder...> wrote:

    > You can get the video data now using UIGetScreenImage, though it's not
    > the lightest on the battery.
    >
    > --Kyle Sluder
    >
    > On Thu, Feb 4, 2010 at 5:53 PM, Conal Elliott <conal...> wrote:
    >> Any ideas on how Live Effects (http://www.omaxmedia.com/) manages to
    >> spatially transform the live camera input?  It doesn't just overlay like
    > the
    >> augmented reality apps.
    >>
    >> Thanks,  - Conal
    >>
    >> On Tue, Dec 15, 2009 at 1:23 PM, Mark Woollard <mark.woollard...>
    >> wrote:
    >>
    >>> They place UI elements over the video feed provided on screen by the
    > built
    >>> in video capture window, but they don't have access to the actual video
    >>> data.
    >>>
    >>> Mark
    >>>
    >>> On 15 Dec 2009, at 16:04, Gabriel Zachmann wrote:
    >>>
    >>>>> There is currently no API for this. The API allows you to place
    >>> overlays on the screen, but video data is not delivered to your app
    > until
    >>> the user is finished recording.
    >>>>
    >>>> So, how do all the so-called augmented reality apps do it?
    >>>>
    >>>> Best regards,
    >>>> Gabriel.
    >>>
    >>>

    >>
    >
  • Thanks, Luke.  I'm stumped about how affine transforms could account for
    these effects, particularly bubble, pinch & swirl; an affine transform keeps
    straight lines straight, so it can't express those per-pixel warps.  Also,
    if I understand correctly, UIImagePickerController doesn't give access to
    the camera input stream -- just an image when the user actually takes a
    picture.  There's a new takePicture method, but I don't think it can be
    called frequently.

      - Conal

    On Thu, Feb 4, 2010 at 8:51 PM, Luke the Hiesterman <luketheh...> wrote:

    > I haven't seen the app, but the simplest way to transform the camera input
    > is via the cameraViewTransform property on UIImagePickerController available
    > in 3.1
    >
    > Luke
    >
    > On Feb 4, 2010, at 8:48 PM, Kyle Sluder wrote:
    >
    >> You can get the video data now using UIGetScreenImage, though it's not
    >> the lightest on the battery.
    >>
    >> --Kyle Sluder
    >>
    >> On Thu, Feb 4, 2010 at 5:53 PM, Conal Elliott <conal...> wrote:
    >>> Any ideas on how Live Effects (http://www.omaxmedia.com/) manages to
    >>> spatially transform the live camera input?  It doesn't just overlay like
    > the
    >>> augmented reality apps.
    >>>
    >>> Thanks,  - Conal
    >>>
    >>> On Tue, Dec 15, 2009 at 1:23 PM, Mark Woollard <mark.woollard...>
    >> wrote:
    >>>
    >>>> They place UI elements over the video feed provided on screen by the
    > built
    >>>> in video capture window, but they don't have access to the actual video
    >>>> data.
    >>>>
    >>>> Mark
    >>>>
    >>>> On 15 Dec 2009, at 16:04, Gabriel Zachmann wrote:
    >>>>
    >>>>>> There is currently no API for this. The API allows you to place
    >>>> overlays on the screen, but video data is not delivered to your app
    > until
    >>>> the user is finished recording.
    >>>>>
    >>>>> So, how do all the so-called augmented reality apps do it?
    >>>>>
    >>>>> Best regards,
    >>>>> Gabriel.
    >>>>
    >>>>

    >>>

    >
    >
  • I know I'm replying to a two-and-a-half-year-old thread, but I'm curious…

    On 15 Dec 2009, at 5:14 PM, Luke the Hiesterman wrote:

    > On Dec 15, 2009, at 3:11 PM, Gabriel Zachmann wrote:
    >
    >> According to this
    >> http://www.tuaw.com/2009/12/14/app-store-approved-app-brings-video-recording-to-iphone-3g-and-1/

    >> they use the non-documented UIGetScreenImage().
    >>
    >> I'd love to use that if it's the only way -- if only somebody could provide me with some simple sample code showing how to use it .
    >
    > while (youWantImages)
    > {
    > CGImageRef image = UIGetScreenImage();
    > //process image here
    > CGImageRelease(image); //even though this is a Get function, it returns a retained image.
    > }
    >
    > Luke

    As far as I can tell, UIGetScreenImage() remains unpublished. What's the status? Will it still get past review?

    — F
  • On 25 May 2012, at 16:28, Fritz Anderson wrote:

    > I know I'm replying to a two-and-a-half-year-old thread, but I'm curious…
    >
    > On 15 Dec 2009, at 5:14 PM, Luke the Hiesterman wrote:
    >
    >> On Dec 15, 2009, at 3:11 PM, Gabriel Zachmann wrote:
    >>
    >>> According to this
    >>> http://www.tuaw.com/2009/12/14/app-store-approved-app-brings-video-recording-to-iphone-3g-and-1/

    >>> they use the non-documented UIGetScreenImage().
    >>>
    >>> I'd love to use that if it's the only way -- if only somebody could provide me with some simple sample code showing how to use it .
    >>
    >> while (youWantImages)
    >> {
    >> CGImageRef image = UIGetScreenImage();
    >> //process image here
    >> CGImageRelease(image); //even though this is a Get function, it returns a retained image.
    >> }
    >>
    >> Luke
    >
    > As far as I can tell, UIGetScreenImage() remains unpublished. What's the status? Will it still get past review?
    >
    > — F

    The e-mail below would suggest no:

    Regards, Rob.

    >> Thank you for submitting XXX to the App Store.
    >>
    >> XXX cannot be posted to the App Store because it is using private or undocumented APIs:
    >>
    >> Private Symbol References
    >> UIGetScreenImage
    >> As you know, as outlined in the iPhone Developer Program License Agreement section 3.3.1, the use of non-public APIs is not permitted. Before your application can be reviewed by the App Review Team, please resolve this issue and upload a new binary to iTunes Connect.
    >>
    >> Sincerely,
    >>
    >> iPhone Developer Program
    >>