Issue with Python multiprocessing in Alpine container
I ran into an issue with Python's multiprocessing on Alpine. At first I thought it was related to the open-file limit (nofile), but that doesn't seem to be the case: I can hold many files open inside the container (tested up to 15k). The same script runs fine on the host OS. I also tried older Alpine versions, but they show the same behaviour.
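For reference, this is roughly how I checked the open-file side (a sketch, not the exact script I used; the batch size of 500 is arbitrary): read the nofile limit, then hold a batch of descriptors open well below it.

```python
import os
import resource
import tempfile

# Read the per-process open-file limit (the suspect I ruled out).
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"RLIMIT_NOFILE soft={soft} hard={hard}")

# Hold a batch of descriptors open at once, staying below the soft limit.
files = []
target = min(500, soft - 100)
with tempfile.TemporaryDirectory() as d:
    try:
        for i in range(target):
            files.append(open(os.path.join(d, f"f{i}"), "w"))
        print(f"held {len(files)} descriptors open without error")
    finally:
        for f in files:
            f.close()
```

In the container this kind of test passes with far more descriptors than the 85 queues the failing script manages to create.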
Steps to reproduce:
docker run -it alpine:3.19 /bin/sh
apk update && apk upgrade
apk add --no-cache py3-pip
cat <<'EOF' > test.py
import multiprocessing
import subprocess
import time

if __name__ == '__main__':
    queues = []
    print(subprocess.check_output(['cat', '/proc/1/limits']).decode())
    for i in range(1000):
        print(f"Appending multiprocessing.Queue() n. {i} ", end="")
        queues.append(multiprocessing.Queue())
        print("ok")
        time.sleep(0.1)
    print("all ok")
EOF
python3 test.py
Error:
Appending multiprocessing.Queue() n. 84 ok
Appending multiprocessing.Queue() n. 85 Traceback (most recent call last):
  File "test.py", line 10, in <module>
    queues.append(multiprocessing.Queue())
  File "/usr/lib/python3.7/multiprocessing/context.py", line 102, in Queue
    return Queue(maxsize, ctx=self.get_context())
  File "/usr/lib/python3.7/multiprocessing/queues.py", line 47, in __init__
    self._wlock = ctx.Lock()
  File "/usr/lib/python3.7/multiprocessing/context.py", line 67, in Lock
    return Lock(ctx=self.get_context())
  File "/usr/lib/python3.7/multiprocessing/synchronize.py", line 162, in __init__
    SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx)
  File "/usr/lib/python3.7/multiprocessing/synchronize.py", line 59, in __init__
    unlink_now)
OSError: [Errno 24] No file descriptors available
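One observation: the traceback bottoms out in SemLock.__init__ (the queue's write lock), not in the pipe the queue also creates, so the limit being hit may be on POSIX semaphores rather than ordinary file descriptors. A stripped-down sketch that allocates only multiprocessing locks (no pipes; the count of 200 is arbitrary) should isolate that:

```python
import multiprocessing

# Each multiprocessing.Lock() is backed by a SemLock, the same object the
# traceback fails in; no pipe descriptors are created here.
locks = []
for i in range(200):
    locks.append(multiprocessing.Lock())
print(f"created {len(locks)} locks without error")
```

On my glibc host this runs clean; I have not yet confirmed at exactly which count it fails on Alpine/musl.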
Any suggestions?