I'm getting an exception that says something like "Too many files open." How do I fix this?

When running screen-scraper on Unix-based systems with several scraping sessions executing simultaneously, you may see exceptions in the log files indicating that too many files are open. The easiest fix is to increase the maximum number of open files the system allows. Depending on the operating system or Linux distribution you're running, there are several ways you might go about this. Here are the ones we've found; in each case, 65535 refers to the maximum number of open files, and you can set it to whatever value you'd like. A combined example follows the list.

  • Edit the "/etc/security/limits.conf" file, and add or edit the following lines (the leading * is a wildcard that applies the limits to all users):
    * soft nofile 1024
    * hard nofile 65535
  • Issue the following command on the command line:
    ulimit -n 65535

    Note that this only affects the current shell session and any processes started from it. You may also want to put the command in your ~/.bashrc file so that it runs automatically whenever you open a new session on the server.

  • Issue the following command on the command line (as root) to raise the kernel's system-wide limit on open files:
    echo 65535 > /proc/sys/fs/file-max
  • Edit the "/etc/sysctl.conf" file, and add or edit the following line so that the system-wide limit persists across reboots:
    fs.file-max=65535
  • In the "/etc/pam.d/su" file, uncomment (remove the leading #) the line so that it reads:
    session required pam_limits.so
  • In the "/etc/pam.d/common-session" file, ensure you have a line that reads:
    session required pam_limits.so
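
As a combined illustration, a setup on a typical Linux server might look like the following. This is only a sketch; the 65535 value is an example, and "screenscraper" is a hypothetical name for the user account that runs screen-scraper, which you would replace with your own:

    # /etc/security/limits.conf -- per-user limits for the account that runs
    # screen-scraper ("screenscraper" is a hypothetical user name)
    screenscraper soft nofile 65535
    screenscraper hard nofile 65535

    # /etc/sysctl.conf -- system-wide ceiling on open files
    fs.file-max=65535

After editing /etc/sysctl.conf you can apply that change immediately (as root) with:

    sysctl -p

Then log out and back in, or restart screen-scraper, so that the new per-user limits take effect for the processes that need them.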

You may have to experiment to find which combination works on your system. If you SSH into the server, the following command will tell you the current maximum number of open files for your session:

ulimit -n
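
If you'd like to see both the soft and hard limits, along with the kernel's system-wide ceiling, the following should work on most Linux systems:

    ulimit -Sn                   # soft limit for the current shell
    ulimit -Hn                   # hard limit for the current shell
    cat /proc/sys/fs/file-max    # system-wide maximum number of open files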