memory leak when using XFERINFOFUNCTION #381

Open
tobiasklevenz opened this issue Jan 31, 2023 · 0 comments
Hey there, we noticed a memory leak when using XFERINFOFUNCTION.

We use the following code in a worker process running on an EC2 instance; it receives a job with a URL from a queue and fetches it:

import { Curl, CurlCode } from "node-libcurl";

// MAX_DL_SIZE (maximum allowed download size in bytes) is defined elsewhere in our codebase

function curl(
  url: string
): Promise<{ status: number; text: string; ok: boolean }> {
  return new Promise((resolve, reject) => {
    const curl = new Curl();

    curl.setOpt(Curl.option.URL, url);
    curl.setOpt(Curl.option.ACCEPT_ENCODING, "gzip,deflate");
    curl.setOpt(Curl.option.FOLLOWLOCATION, 1);

    // XFERINFOFUNCTION is only invoked while NOPROGRESS is disabled
    curl.setOpt(Curl.option.NOPROGRESS, 0);
    // abort the request if it grows larger than MAX_DL_SIZE
    curl.setOpt(Curl.option.XFERINFOFUNCTION, (dltotal) => {
      return dltotal >= MAX_DL_SIZE ? 1 : 0;
    });

    curl.on("end", (status, data) => {
      curl.close();
      resolve({
        status,
        text: data.toString(),
        ok: status >= 200 && status < 300,
      });
    });

    curl.on("error", (e, curlCode) => {
      curl.close();
      reject(
        curlCode === CurlCode.CURLE_ABORTED_BY_CALLBACK
          ? new Error(`dl bigger than ${MAX_DL_SIZE}`)
          : e
      );
    });

    curl.perform();
  });
}

Once we introduced the XFERINFOFUNCTION code to limit the download size, we saw a significant increase in memory usage, which over the course of a few hours completely maxed out our instance's memory. We tried adding curl.enable(CurlFeature.Raw | CurlFeature.NoStorage) and handling the received data in a separate WRITEFUNCTION, but got the same results.
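For completeness, the mitigation we tried looked roughly like this (a sketch, not our exact code): keep the XFERINFOFUNCTION limit, disable node-libcurl's internal response buffering, and collect the chunks ourselves.

    import { Curl, CurlFeature } from "node-libcurl";

    const curl = new Curl();
    // Raw: deliver data as Buffers; NoStorage: don't accumulate the body internally
    curl.enable(CurlFeature.Raw | CurlFeature.NoStorage);

    let data = Buffer.alloc(0);
    curl.setOpt(Curl.option.WRITEFUNCTION, (chunk) => {
      data = Buffer.concat([data, chunk]);
      return chunk.length;
    });

Memory usage grew the same way with this variant, which is why we suspect the leak is tied to the XFERINFOFUNCTION callback itself rather than to response storage.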

To fix the issue, we switched to implementing WRITEFUNCTION and handling the download size limit there:

    // data accumulates the response body; initialized earlier as an empty Buffer
    let data = Buffer.alloc(0);

    curl.setOpt(Curl.option.WRITEFUNCTION, (chunk) => {
      data = Buffer.concat([data, chunk]);

      // returning anything other than chunk.length makes libcurl abort the transfer
      if (data.length >= MAX_DL_SIZE) {
        return 0;
      }

      return chunk.length;
    });

Here's a screenshot of our memory usage showing before and after we switched to WRITEFUNCTION:

[screenshot: memory usage graph]

@JCMais JCMais self-assigned this Feb 13, 2024