This seems like something that people must be doing, but I have been struggling with it. The use case is to use my GPU for some P-MCMC and machine learning (XGBoost).
So far I have gotten nvidia-docker (https://github.com/NVIDIA/nvidia-docker) working on my system; the example nvidia-smi container runs fine.
Most of the suggested solutions I have seen recommend basing the container image on one of the NVIDIA-supplied CUDA images. The others recommend a manual install, but I have been struggling to get that working. I would much prefer to stick with the Rocker base image, so any ideas would be great.
I am way behind on a project to have a GPU-enabled Rocker container, but there are two approaches. First, you can build Rocker on top of the NVIDIA images; this is easier for those who already know the Rocker stack well. Example:
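A minimal sketch of the idea (the base tag and the plain apt install of R below are my own illustrative choices; the official Rocker images build a pinned R version with their own scripts):

```dockerfile
# Sketch only: start from an NVIDIA CUDA image instead of rocker/r-ver and add R on top.
# The tag is illustrative; pick one matching the CUDA version your host driver supports.
FROM nvidia/cuda:11.4.2-cudnn8-devel-ubuntu20.04

# Avoid interactive prompts (e.g. tzdata) during apt installs.
ENV DEBIAN_FRONTEND=noninteractive

# Plain apt install of R plus a compiler toolchain for building GPU-enabled
# packages such as xgboost from source. The real Rocker images instead build
# a specific R version with their own install scripts.
RUN apt-get update \
    && apt-get install -y --no-install-recommends \
        r-base r-base-dev \
        build-essential cmake git ca-certificates \
    && rm -rf /var/lib/apt/lists/*

CMD ["R"]
```

An image built this way still gets run through nvidia-docker as usual (e.g. `nvidia-docker run --rm -it myimage R`, or `docker run --runtime=nvidia ...` with nvidia-docker 2, where `myimage` is whatever tag you built), so the host driver is mounted in at run time.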
Alternatively, install the NVIDIA software into the Docker image. This beast of a Dockerfile installs the NVIDIA packages and GPU-enabled versions of TensorFlow and XGBoost (along with a ton of other things - it's our group's common working image, adapted for GPU). I've neglected it for a while and I'm not 100% sure it works in its current state, but it was working at some point: https://github.com/ecohealthalliance/reservoir/blob/master/Dockerfile.gpu
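For anyone who wants the gist without digging through that file, the core of the approach is roughly this (the base tag, repository path, and CUDA toolkit version below are illustrative, not what the linked Dockerfile pins):

```dockerfile
# Sketch only: keep a Rocker base image and layer NVIDIA's user-space CUDA libraries on top.
# rocker/verse images for R >= 4.0 are Ubuntu 20.04 based, hence the ubuntu2004 repo path;
# adjust the toolkit version to match the driver on your host.
FROM rocker/verse:4.0.2

# Add NVIDIA's apt repository via their keyring package, then install the CUDA toolkit.
RUN apt-get update \
    && apt-get install -y --no-install-recommends curl ca-certificates \
    && curl -fsSL -O https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/cuda-keyring_1.1-1_all.deb \
    && dpkg -i cuda-keyring_1.1-1_all.deb \
    && rm cuda-keyring_1.1-1_all.deb \
    && apt-get update \
    && apt-get install -y --no-install-recommends cuda-toolkit-11-4 \
    && rm -rf /var/lib/apt/lists/*

# Hints for nvidia-docker / libnvidia-container about which GPUs and driver
# capabilities to expose, plus the CUDA paths for anything built from source.
ENV NVIDIA_VISIBLE_DEVICES=all \
    NVIDIA_DRIVER_CAPABILITIES=compute,utility \
    PATH=/usr/local/cuda/bin:${PATH} \
    LD_LIBRARY_PATH=/usr/local/cuda/lib64

# GPU-enabled xgboost and tensorflow still have to be built or installed against
# this toolkit afterwards; that is where most of the length of the real Dockerfile goes.
```

Either way, the CUDA toolkit inside the image only supplies the user-space libraries; the container still has to be started through nvidia-docker so the host's driver is injected.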