
Error: EMFILE: too many open files #4

Open
RainingChain opened this issue Apr 25, 2018 · 6 comments

@RainingChain

I tried running file-matcher on a large project and I got the error:

   (node:12672) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): Error: EMFILE: too many open files, open 'C:\sessions\XXXXXX.cpp'
   (node:12672) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
@mauriciovigolo
Owner

Hi @RainingChain

I was searching about this issue and it seems to be a known limitation of the Node fs module when handling a large number of files. Adding the graceful-fs library seems to be the best option to solve this problem.

I will release a new version with the fix for this problem.

Thanks for reporting!

@RainingChain
Author

Ideally, you would call this.readFileContent for 1000 files at a time (the number configurable via an option parameter) and await until those 1000 files are done before reading the others.

By the way, there's a bug in the following lines. Returning true/false inside the promise callbacks will not stop the this.files.some() loop, because those returns only exit the callback, not the .some() predicate.

    if (this.fileFilter.content && this.files && this.files.length > 0) {
      this.files.some((file, index) => {
        this.readFileContent(file)
          .then((result) => {
            if (result) {
              let processed: number = (index + 1) / this.files.length;
              self.emit('contentMatch', file, processed);
              matchingFiles.push(result);
            }
            if ((self.files.length - 1) === index) {
              resolve(matchingFiles);
              return true;
            }
          })
          .catch(err => {
            reject(err);
            return true;
          });
        return false;
      });
    }

@mauriciovigolo
Owner

Ideally, you would call this.readFileContent for 1000 files (number changeable with an option parameter) and you would await until those 1000 files are done before reading the others.

I think that is a good solution to handle the situation, and it avoids adding a new library.

@DavidNiembro

Hello, any news on this topic?

@mauriciovigolo
Owner

Hello @DavidNiembro, I will fix it with a solution similar to the one mentioned by @RainingChain. I will try to release it this week, okay?

@mauriciovigolo mauriciovigolo self-assigned this Sep 3, 2018
@superRaytin

I've been troubled by this problem for days, and since it has gone unsolved for a long time, I have published a new library, file-content-matcher, to solve it. Beyond that, it supports some other options:

  • readFileConcurrency the concurrency behavior for reading files. Default is 1000.
  • recursiveDepth the depth while searching recursively from the given path. Default is 0.
  • micromatchOptions extra options used in micromatch.match function.

Hope it helps.
