How to really fix the too many open files problem for Tomcat in Ubuntu

A couple of days ago we ran into the infamous “too many open files” error when our Tomcat web server was under load. There are several blogs around the internet that try to deal with this issue, but none of them seemed to do the trick for us. Usually what you do is set the ulimit to a greater value (it’s something like 1024 by default). To make the change permanent after a reboot, the first thing suggested is to increase the value in /proc/sys/fs/file-max and then edit /etc/security/limits.conf, adding the line * - nofile 2048 (see here for more details). But none of this worked for us. We saw that when doing

cat /proc/<pid>/limits

the limit was still set to the initial value of 1024:

Limit                     Soft Limit           Hard Limit           Units
Max cpu time              unlimited            unlimited            seconds
Max file size             unlimited            unlimited            bytes
Max data size             unlimited            unlimited            bytes
Max stack size            8388608              unlimited            bytes
Max core file size        0                    unlimited            bytes
Max resident set          unlimited            unlimited            bytes
Max processes             63810                63810                processes
Max open files            1024                 1024                 files
Max locked memory         65536                65536                bytes
Max address space         unlimited            unlimited            bytes
Max file locks            unlimited            unlimited            locks
Max pending signals       63810                63810                signals
Max msgqueue size         819200               819200               bytes
Max nice priority         0                    0
Max realtime priority     0                    0
Max realtime timeout      unlimited            unlimited            us
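As an aside, /proc/<pid>/limits shows the limits a process is actually running with, regardless of what limits.conf says. A minimal sketch for checking a running Tomcat; the pgrep pattern is an assumption, adjust it to however your instance is launched:

```shell
# Find the Tomcat JVM and inspect its live descriptor limits. If no matching
# process is found, fall back to the current shell ("self") for illustration.
pid=$(pgrep -f 'org.apache.catalina' | head -n1)
grep 'Max open files' "/proc/${pid:-self}/limits"
```

If that line still reads 1024 after editing limits.conf, the new limits were not applied to that process.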

It was not until we found this thread that the reason and solution became clear. Our Tomcat instance was started as a service during boot, and there’s a bug, discovered and filed (with a patch) in 2005, that doesn’t seem to have been resolved yet. The bug reveals itself by ignoring the max open files limit when starting daemons in Ubuntu/Debian. So the workaround suggested by “BOK” was to edit /etc/init.d/tomcat and add:

ulimit -Hn 16384
ulimit -Sn 16384

Finally the max number of open files for Tomcat was increased!
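For reference, here is a minimal sketch of what those two lines do when placed near the top of an init script (16384 is the value from the workaround; raising the hard limit requires root, which is why it works in a boot-time script):

```shell
#!/bin/sh
# Run in a subshell so the current session's limits are left untouched;
# in a real init script the calls sit at the top, before the daemon starts.
(
  ulimit -Hn 16384 2>/dev/null || echo "raising the hard limit needs root"
  ulimit -Sn 16384 2>/dev/null || echo "soft limit is capped by the hard limit"
  echo "soft=$(ulimit -Sn) hard=$(ulimit -Hn)"
)
```

Any process launched afterwards from the same script, such as the Tomcat JVM, inherits the new values.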

This Post Has 38 Comments

  1. Wojtek Erbetowski

    Nice investigation! After reading the title I was sure you wanted to show how to solve it by addressing an open-files leak :-) I wouldn’t have suspected how hard it could be to increase the limit.

    And thanks for attaching all the links, it reads well. Thanks for sharing! You probably saved some of us a lot of time in the future :-)

  2. Johan Haleby

    Thanks for your comment :)

  3. basit

    I think you should read the comment in /etc/security/limits.conf carefully:
    # – NOTE: group and wildcard limits are not applied to root.
    # To apply a limit to the root user, the domain must be
    # the literal username root.

    so, in /etc/security/limits.conf you should add these lines:
    # End of file
    * hard nofile 65535
    * soft nofile 65535
    root hard nofile 65535
    root soft nofile 65535

    because some people say that 65535 is effectively unlimited

  4. Chris

    I just implemented this on our servers, I was curious about one thing though:
    How would this affect the stability of an Ubuntu server running Tomcat? My manager has expressed concerns that Tomcat will proactively open as many file handles as it is allowed, and that setting both the hard and soft limit to such a high number will cause the VM in question to chew up tonnes of memory.
    Obviously I do understand that if a huge amount of traffic hits the site then Tomcat will open a lot of files and probably hit its memory limit fairly quickly; that’s expected. I was just wondering if anyone can shed light on the effect of increasing the open files limit when this many files are not required to be open. Does Tomcat only open files/sockets when required?

    In the meantime, instead of following these instructions to the letter I have added to the init.d file (for alfresco rather than Tomcat in my case, though the alfresco service starts Tomcat anyway):
    ulimit -Hn 16384
    ulimit -Sn 4096
    I am hoping that a lower soft limit will prevent the situations my manager is worried about.

    On a different note, thanks very much for the help! The first time I thought I had “fixed” this issue, only to see it return when I rebooted the server, was incredibly frustrating, and your instructions were vital to a more permanent fix.

  5. Vikram

    Thanks this helped us!!!

  6. lalo_uy

    Many thanks for the tip. you save my site :)

  7. Jacovasaure

    Thx a lot!!!

  8. gibffe

    Wow, that saved my day. I was running exactly the same setup as I normally would on EC2 and started getting “too many open files” out of the blue; that fix worked! Amazon Linux ftw :P

  9. Mike O'Connor

    Just adding to the applause mate :) You saved us.. our build process ground to a halt with “too many open files” and no matter where we tried to set the limit, it always came back as 1024.

  10. sandy

    You have mentioned adding the ulimits in Tomcat (/etc/init.d/tomcat). I welcome your suggestion, but I don’t know where exactly I need to add that info in my tomcat file:

    under the tomcat start() method, or globally? Sorry, I’m a newbie and need this for my academic project.

    Please help me out.


  11. Kristen

    Finally, a clean comprehensive description of the problem with a solution that works. THANKS! You saved my new production installation from this problem.


  12. Do you mind if I quote a couple of your posts as long as I provide credit and sources back to your website?
    My blog is in the very same area of interest as yours and
    my visitors would truly benefit from some of the information you
    provide here. Please let me know if this okay with you.

    1. Johan Haleby

      Sure I don’t mind, glad that you found it useful.

  13. Vivek

    Thanks for the sweet and short explanation. Just an update, guys: the bug is still present on Tomcat 7 today (September 23rd, 2014).

  14. josant

    Thx a lot, it works

  15. divyang

    I have edited /etc/init.d/tomcat and added those 2 lines, but nothing changed. I want to know where to put those 2 lines in the file: at the end, or somewhere else? Please give me a suggestion.

    1. Johan Haleby

      This was so long ago that I don’t remember. Try reading the links that I refer to.

    2. Foxi

      Put it just after the “start)”:

      case “$1” in
      start)
      ulimit -Hn 4096
      ulimit -Sn 4096

  16. Lloyd

    I’ve tried these fixes and the errors go away in the short term … the problem is I’m not certain Tomcat is properly closing files. For instance, I set a very large file limit and I don’t see any way Tomcat could possibly need that many. Now I’m looking in the Java code to see if I’m not closing files correctly, or if there are settings in Tomcat to close them.

  17. Ikenna

    Hey, you need to reboot the server for the changes in /etc/security/limits.conf to take effect.

    1. Ian Cervantez

      @Ikenna try this to avoid the server reboot
      sudo sysctl -p

  18. Mark

    Should the ulimit be set to all users individually?

  19. Carlos Valderrama

    Hi! first of all Thank you so much for this info!

    You have literally saved my work!

    I work in a financial institution and we were having this issue in a production environment; the internet banking was being affected every so often because the new web services deployed under Tomcat kept collapsing…

    We applied all the configurations you suggested and it became stable!

  20. Viral

    Hi Johan,

    Instead of rebooting the server to update the file-max value, you can edit /etc/sysctl.conf, set fs.file-max=100000, and run the command sysctl -p; that will re-read /etc/sysctl.conf. Then check the value with cat /proc/sys/fs/file-max.
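    In shell form this is roughly the following (a sketch using a scratch copy so it can be tried safely; on a real server you would edit /etc/sysctl.conf itself as root, and 100000 is just an example value):

    ```shell
    # Persist the system-wide file handle ceiling and apply it without a reboot.
    conf=$(mktemp)
    echo 'fs.file-max = 100000' >> "$conf"
    if command -v sysctl >/dev/null 2>&1; then
      sysctl -p "$conf" 2>/dev/null || true   # applying the value needs root
    fi
    cat /proc/sys/fs/file-max                 # current system-wide ceiling
    rm -f "$conf"
    ```

    Note that fs.file-max is the system-wide ceiling, separate from the per-process nofile limit.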

  21. Raphaël Lemaitre

    Hi Johan,

    Thanks for this fix.
    I think the workaround should be written in /etc/default/tomcat7 instead of /etc/init.d/tomcat7. That way it won’t be overridden by future upgrades (via apt-get).

  22. Mike

    Thanks for this. I was starting to think I was going insane. The bug in debian is still present today. We keep quite a few temporary files around with long running jobs and this fixed my problem.

  23. Thanks from Brazil!

    you just saved our clients after 3 hours of system down!

  24. Ganesha

    I am always using too many files and I have problems opening that many files on my system (Windows 7); please help me figure out how to solve it.

  25. IP

    But it’s only a way to get rid of the effects: “My Tomcat/app is broken, so we increase the limit.” We should fix the real problem, namely too many files being opened by Tomcat.

  26. jey

    Thanks, this is the most useful article. I spent weeks on this issue and finally it has been resolved.

  27. Ashutosh Singh

    I am facing this issue continuously. I raised the hard limit to 12288 and the soft limit to 4096 in limits.conf.
    I have deployed a Spring Boot application on an EC2 instance and am not able to find the Tomcat process id so that I can check its open file value and increase it. (cat /proc/<pid>/limits) How do I use this command for a Spring Boot app?
    Please help me find a solution to this.

  28. Rony

    Hi, thanks for the tips. I created the tomcat file, as I did not have one, with:
    ulimit -Hn 65000
    ulimit -Sn 65000
    However, cat /proc/2843/limits still shows:
    Max processes 65000 65000 processes
    Max open files 4096 4096 files
    file-max shows 8102020, which is more than the number I need.
    I hope someone has a clue as to why I can’t raise Max open files. Thanks!

  29. Kim Tiago Baptista

    Thank you!!! We are in 2019 and this is still a problem… You saved my life!!!

  30. Ashwin Pai

    What is the impact of more open files on performance? Does having more open files affect resource utilization? Please let me know the best practice for controlling memory consumption with a large number of open files. I observe 92% heap utilization and 195717 open files.

  31. no win no fee

    A round of applause for your blog article. Really thank you! Cool.

  32. Wiley E. Coyote

    I have a similar problem. Although I can increase the ulimit, the issue is that Tomcat is excessively opening files on each request. With one web-app that has 160 jars I see tens of thousands of opens and closes happening, and they are clearly undesirable. The problem is clearly either a bug in Tomcat or its config; I just haven’t been able to prove which it is.

  33. Denis Riendeau

    Well, it’s 2021, and I’m running into this problem…
    I’m on Red Hat Enterprise Linux Server release 7.9 (Maipo).
    Adding the ulimit lines in a file called /etc/init.d/tcserver does not work (tcserver is the user account under which the Java process runs).
    tcserver@server_name:~ $ cat /proc/1460/limits|grep open
    Max open files 4096 4096 files
    tcserver@dlsdnap00009143:~ $ ulimit -n
