Support Article
SocketException: Too many open files
Summary
The user is seeing multiple socket errors in the logs, causing poor performance on the affected nodes, which run on Linux.
Error Messages
Dec 14, 2015 1:31:38 AM org.apache.tomcat.util.net.JIoEndpoint$Acceptor run
SEVERE: Socket accept failed
java.net.SocketException: Too many open files
at java.net.PlainSocketImpl.socketAccept(Native Method)
at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:398)
at java.net.ServerSocket.implAccept(ServerSocket.java:530)
at sun.security.ssl.SSLServerSocketImpl.accept(SSLServerSocketImpl.java:317)
at org.apache.tomcat.util.net.jsse.JSSESocketFactory.acceptSocket(JSSESocketFactory.java:178)
at org.apache.tomcat.util.net.JIoEndpoint$Acceptor.run(JIoEndpoint.java:352)
at java.lang.Thread.run(Thread.java:724)
Steps to Reproduce
Not applicable
Root Cause
The "Too many open files" condition on the Linux nodes occurred because the file descriptor limit was exhausted by leaked sockets that lsof reports as "can't identify protocol", which was caused by a JDK bug. In addition, the node had not been restarted for a long time, allowing the leaked descriptors to accumulate, which also contributed to this incident.
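A quick way to confirm this pattern is to compare the process's descriptor usage against its limit and count the leaked sockets. The commands below are illustrative only; <pid> is the Tomcat process ID and lsof is assumed to be installed.
# Current number of open file descriptors for the Tomcat process
ls /proc/<pid>/fd | wc -l
# Per-process limit currently in effect
grep "Max open files" /proc/<pid>/limits
# Leaked sockets that lsof reports as "can't identify protocol"
lsof -p <pid> | grep -c "can't identify protocol"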
Resolution
Although there are a few hypotheses, the exact sequence of events that caused the file descriptors to reach their limit could not be established.
The following changes have been made to the operating environment to ensure that this issue does not recur:
* Increased the file descriptor limit from 1024 to 65536 (example commands for applying the limit are shown after this list).
* A monitoring script is in place to watch file descriptor usage and alert the operations team when it crosses an 85% threshold (a sketch of such a check also appears below).
* Reduced file operations by disabling requestor passivation and data indexing.
* As a best practice, the user was advised to upgrade the JDK to the latest version.
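A typical way to apply the higher limit on Linux is sketched below. The exact configuration file and account name depend on the distribution and on how Tomcat is started, so treat the "tomcat" user and the limits.conf entries as assumptions to adapt.
# Check the current soft and hard limits for the account that runs Tomcat
ulimit -Sn
ulimit -Hn
# Persist the new limit in /etc/security/limits.conf (assumes the service runs as the "tomcat" user)
tomcat  soft  nofile  65536
tomcat  hard  nofile  65536
# Restart Tomcat (or re-login as the service user) so the new limit takes effect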
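A minimal sketch of the monitoring check described above, assuming a single Tomcat instance started via the standard Bootstrap class; the 85% threshold, the mail command, and the recipient address are placeholders to replace with the operations team's preferred alerting channel.
#!/bin/sh
# Alert when the Tomcat process has used more than 85% of its file descriptor limit
PID=$(pgrep -f org.apache.catalina.startup.Bootstrap | head -n 1)
LIMIT=$(awk '/Max open files/ {print $4}' /proc/"$PID"/limits)
USED=$(ls /proc/"$PID"/fd | wc -l)
PCT=$((USED * 100 / LIMIT))
if [ "$PCT" -ge 85 ]; then
    # Hypothetical alert; swap in the real notification mechanism
    echo "FD usage at ${PCT}% (${USED}/${LIMIT}) for PID ${PID}" | mail -s "File descriptor alert" ops@example.com
fi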
Published February 26, 2016 - Updated October 8, 2020