On AIX operating systems, the concurrent submission of one hundred or more jobs on the same agent can result in a core dump or in a resource temporarily unavailable message
On AIX operating systems, if you submit one hundred or more jobs
concurrently on the same agent, you might receive a core dump
or the following message:
resource temporarily unavailable
Cause and solution:
This problem is caused by insufficient memory and by too low a limit
on the number of processes per user available to run the jobs concurrently.
To solve this problem, verify the values of the following configuration
settings and change them as follows:
- Ulimit settings
- The submission of a significant number of Java jobs requires a
large amount of memory. Change the values for the data, stack, and memory
limits according to the number of jobs that you want to submit. The submission
of a significant number of native jobs requires a high number of file
descriptors and processes. Change the values for nofiles and processes
according to the number of jobs that you want to submit. The following
example shows possible setting values for submitting 100 jobs concurrently:
time(seconds)        unlimited
file(blocks)         2097151
data(kbytes)         131072
stack(kbytes)        32768
memory(kbytes)       32768
coredump(blocks)     2097151
nofiles(descriptors) 4000
threads(per process) unlimited
processes(per user)  unlimited
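As a quick sanity check, you can compare the current soft limit for file descriptors against the example value above. This is a minimal sketch, assuming the 4000-descriptor figure suggested for 100 concurrent jobs; the threshold is illustrative, not a product requirement:

```shell
#!/bin/sh
# Compare the current nofiles soft limit for this user against the
# example value of 4000 descriptors suggested above for 100 jobs.
CURRENT=$(ulimit -n)
if [ "$CURRENT" = "unlimited" ] || [ "$CURRENT" -ge 4000 ]; then
    echo "nofiles OK ($CURRENT)"
else
    # Raising the soft limit for the session, for example:
    #   ulimit -n 4000
    echo "nofiles too low ($CURRENT)"
fi
```

Run the check as the same user that runs the agent, because ulimit values are per user and per session.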
- Process number per user
- To submit a high number of jobs concurrently, you must set a high
value for the maxuproc setting. Use the lsattr -E -l sys0
-a maxuproc command to verify the maximum number of concurrent processes
that a user can create. Use the chdev -l sys0 -a maxuproc=<value> command
to change the value of the maxuproc setting. For example,
to submit 100 jobs concurrently, use the following command:
chdev -l sys0 -a maxuproc=500
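To size maxuproc for a different workload, you can scale it from the planned job count. This is a hypothetical sizing helper, not part of the product: the factor of 5 is an assumption (each job may spawn helper processes) chosen so that 100 jobs yields the maxuproc=500 value used in the example above. It only prints the chdev command; run the printed command on the AIX system as root to apply it:

```shell
#!/bin/sh
# Hypothetical sizing helper: derive a maxuproc value from the number
# of jobs to submit concurrently. The multiplier of 5 is an assumed
# per-job process overhead; adjust it for your environment.
JOBS=100
MAXUPROC=$((JOBS * 5))
echo "chdev -l sys0 -a maxuproc=$MAXUPROC"
# → chdev -l sys0 -a maxuproc=500
```

Note that chdev -l sys0 changes a system-wide attribute, so the new value affects every user on the partition.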