Details
- Type: Bug
- Resolution: Answered
- Priority: Medium
- Fix Version/s: None
- Affects Version/s: 2.5.0, 2.5.1, 2.5.2, 2.5.3
- Component/s: None
- Environment: Standalone
  java version "1.6.0_51"
  Java(TM) SE Runtime Environment (build 1.6.0_51-b11-456-10M4508)
  Java HotSpot(TM) 64-Bit Server VM (build 20.51-b01-456, mixed mode)
Description
After some time, and without any significant load, Stash crashes with "Too many open files" errors in catalina.out (you can download a complete version of the log file here: http://extranet.ttree.ch/~dfeyer/catalina.out.gz).
Stash runs on OS X 10.6 / Java 6:
- uname -a
Darwin everest.intranet.ttree.ch 10.8.0 Darwin Kernel Version 10.8.0: Tue Jun 7 16:33:36 PDT 2011; root:xnu-1504.15.3~1/RELEASE_I386 i386
I tried increasing the ulimit values, but without any success; Stash continues to crash:
- ulimit -a
-t: cpu time (seconds) unlimited
-f: file size (blocks) unlimited
-d: data seg size (kbytes) unlimited
-s: stack size (kbytes) 8192
-c: core file size (blocks) 0
-v: address space (kb) unlimited
-l: locked-in-memory size (kb) unlimited
-u: processes 2000
-n: file descriptors 32768
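To see whether the process is actually approaching that 32768 descriptor limit, the JVM's open files can be counted directly. A minimal sketch, assuming the Stash JVM shows up as a Tomcat/catalina process (the `pgrep` pattern is an assumption; adjust it to match your process list):

```shell
#!/bin/sh
# Sketch: compare the Stash JVM's current open file descriptors against
# the shell's per-process limit. "catalina" is an assumed process pattern.
PID=$(pgrep -f catalina | head -n 1)
if [ -n "$PID" ]; then
  # lsof lists one line per open file/socket for the given PID
  OPEN=$(lsof -p "$PID" 2>/dev/null | wc -l)
  LIMIT=$(ulimit -n)
  echo "pid=$PID open=$OPEN limit=$LIMIT"
else
  echo "no catalina process found"
fi
```

Running this periodically (e.g. from cron) while Stash is under normal use would show whether the descriptor count grows steadily, which would point to a leak rather than a limit that is simply too low.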
Does Stash need more than 32768 open files? We don't host more than 40 repositories. Some projects are big, but most are of average size. We have a bit less than 3 GB of data in our Stash home directory.
Thanks for your help.
Attachments
Issue Links
- is related to: BSERV-3973 Hook callback sockets are not closed correctly (Closed)