File descriptors not closed properly on OSX #12776
Comments
Can somebody from @nodejs/platform-macos confirm this? Also, it might help us if you could provide some kind of minimal reproduction for this issue.
I have not run into this. Running macOS 10.12 for about 2 weeks now. I'll try some sample code if any is provided.
Hmm, @giacomodabisias what terminal are you using?
Hi, I will try to produce some code. I am using iTerm2; actually, it could be that there is an issue with that. Just to be sure, I will try it with the system terminal.
OK, I tried, and the system Terminal has the same issue.
@galambalazs Do you have a rough estimate of how many console calls it takes to notice anything, and/or how fast those are done?
@Fishrock123 was I CC'd by mistake?
I'm using node 7.9 on OS X 10.11.6 (term is zsh). I have an application with extensive logging but haven't noticed any issue with FDs as reported by the OP. However, I tried a simple loop with console.log and it does seem to have a memory growth issue (process size rapidly rises to about 1.6 GB), as well as getting slower and slower (and making the rest of the OS unresponsive) once over 8 million or so messages are logged. The sample code:

```js
'use strict';
// Copyright (c) 2017, Sam Thompson (under ISC license).
try {
  var num = 1;
  while (1) {
    console.log('Log: ' + num);
    num++;
  }
} catch (err) {
  console.error('FATAL ERROR: ' + err.message);
  throw err; // Rethrown for the stack trace.
}
```

However, lsof only shows about a dozen FDs in use, and this number doesn't grow. I assume I'm using lsof correctly. I haven't managed to get it to crash or bail with any error, though, so I can't be certain this reproduces the OP's issue, or that it's even the same problem, but it does make my Mac sad.
I thought it might be related to the unlimited scroll-back setting, but changing this to 10k makes no difference to the memory effects. HTH; happy to run other tests on the Mac, especially if the OP can provide sample code.
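For cross-checking lsof from inside the process, here is a minimal sketch (assuming /dev/fd is available, as it is on macOS and Linux; openFdCount is just an illustrative helper, not something from this thread) that periodically prints the current descriptor count to stderr:

```js
'use strict';
const fs = require('fs');

// Count this process's open file descriptors by listing /dev/fd.
// Note: readdirSync itself briefly uses a descriptor, so the count
// may be off by one while the call runs.
function openFdCount() {
  return fs.readdirSync('/dev/fd').length;
}

setInterval(() => {
  process.stderr.write('open fds: ' + openFdCount() + '\n');
}, 1000);
```

If the OP's report is accurate, this count should climb while a console.log loop runs on the affected setup.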
@sambthompson @giacomodabisias Can you reproduce the issue with the official binaries from https://nodejs.org/?
Sorry for muddying the waters here; I must have misread the docs. I understood this was async on Mac. I know there have been some changes around this (e.g. #6816). But according to https://nodejs.org/dist/latest-v7.x/docs/api/process.html#process_a_note_on_process_i_o, it's sync (i.e. isn't Mac OS considered a Unix)?
@sambthompson The documentation is right that the operation itself is synchronous (the write system call blocks) but node's internal bookkeeping may not be released until the next tick of the event loop. If you replace …
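The comment above is cut off, but a plausible reading (my assumption, not the commenter's confirmed words) is the common suggestion to write to stdout's file descriptor directly, which bypasses console.log's stream bookkeeping:

```js
'use strict';
const fs = require('fs');

// Hypothetical variant of the truncated suggestion: write straight
// to fd 1 (stdout) with fs.writeSync instead of console.log.
let num = 1;
while (num <= 1e7) {
  fs.writeSync(1, 'Log: ' + num + '\n');
  num++;
}
```

If memory stays flat with this variant, that points at console.log's per-call bookkeeping rather than at the terminal.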
So I think it was actually my fault. I will investigate more and let you know soon.
Node 7.9
macOS Sierra 10.12.4
Node installed through Homebrew
Hello everyone.
I am using node to process several files and I noticed at some point that node was breaking due to having too many file descriptors in use. I was quite sure that I was handling the files correctly, so I started investigating by removing code and checking the number of open file descriptors. In the end my code consisted of just a for loop printing a message.
I had a look at the open file descriptors with lsof and noticed that every time I invoke console.log() a new file descriptor is created for /dev/ttys001, which is the current shell's tty if I am not wrong.
These file descriptors continue to accumulate until I reach the maximum and then the code breaks.
If I remove the console.log() everything works fine. I tested the same code on Ubuntu 16.04 and there I have no issues: the number of file descriptors stays almost constant.
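A minimal sketch of the reduced test case described above (hypothetical, since the exact code was not posted):

```js
'use strict';

// Hypothetical reconstruction of the reduced test case:
// nothing but a loop that prints via console.log().
for (let i = 0; i < 1000000; i++) {
  console.log('message ' + i);
}
```

While this runs, `lsof -p <pid> | wc -l` in another terminal should show whether entries for /dev/ttys001 accumulate.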
Any idea about this issue?
Thanks a lot