CURL and Connection limits

  • hi guys.

    I'm currently using libcurl.framework for an app I'm making, and I've
    been getting some odd behavior.
    First off, I always get error messages saying that I can't connect to the
    FTP server, and when I turn on verbose
    logging, it shows that I get the error *after* a change-working-directory
    (CWD) command. What's more
    interesting is that these errors always start on the 230th to 234th item,
    which is why someone on the libcurl mailing
    list suggested my process may be running out of file handles /
    descriptors. Another thing is that it also affects
    all my NSURLRequest calls, which fail with an error similar to the one
    from my libcurl code: can't connect to the
    server.

    Naturally, I checked that I was closing all my file handles and curl
    connections, and I even moved away from
    NSURLRequest entirely and switched to libcurl so I could force each
    connection closed. Unfortunately, the
    problem still persists.

    Any help or suggestion will be appreciated.
  • I don't claim that our problems are the same, but I offer my
    experiences as a data point.

    This sounds sort of like a problem I've run into in the past when
    connecting to a Yahoo (including Geocities) server. Something about
    the way they're configured makes using curl to access them hard to do
    properly.

    Performing one operation per run loop seems to help. Presumably,
    when the autorelease pools are drained at the end of a run loop, some
    under-the-hood curl stuff gets cleaned up.

    Another frustrating thing about the Yahoo/Geocities FTP servers is
    that you must space your operations (by "operation" I mean a single
    curl perform call) some small amount of time (on the order of 3
    seconds or so) apart, or the server will reject your login something
    like 25% of the time.

    It's a pain in the butt.

    Sorry for the fuzzy/vague info. It was a while ago, and I don't have
    my sources handy.

    _murat

    On Sep 17, 2007, at 12:17 PM, Jofell Gallardo wrote:

    > I'm currently using libcurl.framework for an app I'm making, and it
    > was a
    > bit weird that I get these behaviors. [...]
    > Any help or suggestion will be appreciated.
    It's confirmed, guys... even curl connections to S3 do this... so
    basically, when I download
    230-234 items from S3 or FTP, the curl issues show up, e.g. curl
    saying it can't connect to
    the server.

    I'll forward this problem to the libcurl list, and I hope you guys can
    also help me out. This could be a
    Mac-only problem.

    On 9/18/07, Murat Konar <murat...> wrote:
    >
    > I don't claim that our problems are the same, but I offer my
    > experiences as a data point. [...]
    > Sorry for the fuzzy/vague info. It was a while ago, and I don't have
    > my sources handy.
    >
    > _murat
  • On 9/17/07, Jofell Gallardo <jofell...> wrote:
    > Surely, I checked if I closed my file handles and curl connections, which
    > led me to get away from any
    > NSURLRequest instantiation and use libcurl for me to force the closing of a
    > connection. Unfortunately, the
    > problem still persists.

    One other thing you could try is increasing the limit for your
    process. For example, before doing your CURL work execute something
    like this:

    #include <stdlib.h>
    #include <stdio.h>
    #include <string.h>
    #include <errno.h>
    #include <sys/types.h>
    #include <sys/time.h>
    #include <sys/resource.h>

    int main( int argc, char **argv )
    {
      struct rlimit rl = {0};
      int result = 0;

      result = getrlimit( RLIMIT_NOFILE, &rl );
      if (result != 0)
        printf( "getrlimit failed: %s\n", strerror( errno ) );
      else {
        /* rlim_t is wider than int, so cast before printing */
        printf( "rl.rlim_cur = %llu\n", (unsigned long long) rl.rlim_cur );
        printf( "rl.rlim_max = %llu\n", (unsigned long long) rl.rlim_max );

        rl.rlim_cur = 512;  /* new soft limit; must not exceed rl.rlim_max */
        result = setrlimit( RLIMIT_NOFILE, &rl );
        if (result != 0)
          printf( "setrlimit failed: %s\n", strerror( errno ) );
        else
          printf( "setrlimit succeeded\n" );
      }

      return 0;
    }

    If the number of files you can handle suddenly increases to ~ 500 then
    you know you're still leaking handles somewhere.
  • On 9/19/07, stephen joseph butler <stephen.butler...> wrote:
    > One other thing you could try is increasing the limit for your
    > process. [...]
    > If the number of files you can handle suddenly increases to ~ 500 then
    > you know you're still leaking handles somewhere.

    Thanks Stephen. I'm testing this now. One question: could AppleScript
    somehow be leaking file handles?
  • On 9/19/07, Jofell Gallardo <jofell...> wrote:
    > Thanks Stephen. Am testing this now. One question. Does AppleScript
    > somehow could leak file handles?

    Hrmm... I doubt it. Otherwise people would notice.

    Also, in case it wasn't obvious, the get/setrlimit code has to go in
    YOUR executable. What I pasted was just an example of how to use it.
    rlimits are per process settings.