Request for increased --shm-size for Docker container submissions

Dear organizers,
A recent Docker submission of mine failed, and the error is most likely due to insufficient shm-size. When I run the container locally, passing the extra argument --shm-size=16g to docker run resolves the same issue.
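For reference, a minimal way to confirm how much shared memory is actually available inside the container (a diagnostic sketch, not code from my submission) is:

```
import os

# Diagnostic sketch: report the size of the shared-memory mount
# (/dev/shm) inside the container. With Docker's defaults this is
# typically only 64 MB, which multiprocessing workers can exhaust quickly.
st = os.statvfs("/dev/shm")
total_gib = st.f_frsize * st.f_blocks / 1024**3
free_gib = st.f_frsize * st.f_bavail / 1024**3
print(f"/dev/shm total: {total_gib:.2f} GiB, free: {free_gib:.2f} GiB")
```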
Would it be possible to run the containers with increased shm-size?
Thanks for your help!
Best regards,
Adversary
Thanks for looking into the issue!
Even though the shm-size has been increased, I still get the same error that I see locally when I run the container without the --shm-size=16g option. When I do use that option locally, the error disappears.
Here is the relevant part of my logs:
```
File "/usr/lib/python3.10/multiprocessing/managers.py", line 817, in _callmethod
conn.send((self._id, methodname, args, kwds))
File "/usr/lib/python3.10/multiprocessing/connection.py", line 206, in send
self._send_bytes(_ForkingPickler.dumps(obj))
File "/usr/lib/python3.10/multiprocessing/reduction.py", line 51, in dumps
cls(buf, protocol).dump(obj)
File "/usr/local/lib/python3.10/dist-packages/torch/multiprocessing/reductions.py", line 618, in reduce_storage
fd, size = storage._share_fd_cpu_()
File "/usr/local/lib/python3.10/dist-packages/torch/storage.py", line 447, in wrapper
return fn(self, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/torch/storage.py", line 522, in _share_fd_cpu_
return super()._share_fd_cpu_(*args, **kwargs)
RuntimeError: unable to write to file : No space left on device (28)
```
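The traceback goes through PyTorch's file-descriptor sharing path (reduce_storage calling _share_fd_cpu_), which backs shared CPU tensors with /dev/shm, so the failure is consistent with a too-small shared-memory mount. In case the shm-size cannot be raised further on your side, a possible workaround in my own code (a sketch, assuming the shared tensors come from DataLoader worker processes, which the logs suggest but do not prove) would be to switch PyTorch to its file-backed sharing strategy:

```
import torch.multiprocessing as mp

# Workaround sketch (assumption: the error comes from tensors shared
# between DataLoader worker processes). The "file_system" strategy
# shares tensors through regular files on disk instead of file
# descriptors backed by /dev/shm, so it does not hit the shm-size limit.
mp.set_sharing_strategy("file_system")
```

Setting num_workers=0 in the DataLoader would also avoid inter-process tensor sharing entirely, at the cost of slower data loading.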
Thanks for your help!
Best regards,
Adversary

Hi @Adversary,
We have increased the shm-size. Please try submitting again and let us know if you encounter any further issues.
Thanks,
The LISA 2025 Challenge Organizers