Tips for installing CUDA into a rocker docker container


Hi there,

This seems like something that people must be doing but I have been struggling with it. The use case is to make use of my GPU for some P-MCMC and machine learning (Xgboost).

So far I have got to the stage of having nvidia-docker working on my system (the example nvidia-smi container runs fine).

Most of the suggested solutions I have seen recommend basing the container image on one of the NVIDIA-supplied CUDA images. The others recommend a manual install, but I have been struggling to get that working. I would much prefer to stick with a rocker base image, so any ideas would be great.




I am way behind on a project to build a GPU-enabled rocker container. There are two approaches. The first is to build rocker on top of the NVIDIA images, which is easier for those who know the rocker stack well. Example:
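A minimal sketch of what that first approach could look like. The CUDA image tag and the apt-based R install are assumptions for illustration; rocker's own Dockerfiles use more elaborate install scripts:

```dockerfile
# Hypothetical sketch: start from an NVIDIA CUDA runtime image
# and layer an R installation on top, rocker-style.
# The image tag below is an assumption - pick one matching your driver.
FROM nvidia/cuda:11.8.0-cudnn8-runtime-ubuntu22.04

# Install R from the distribution repositories (the real rocker
# images install newer R builds via their own scripts).
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
        r-base r-base-dev && \
    rm -rf /var/lib/apt/lists/*

CMD ["R"]
```

Because the CUDA libraries come from the base image, the container only needs to be run with GPU access (e.g. via nvidia-docker or `docker run --gpus all`).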

Alternatively, install the NVIDIA software into the Docker image. This beast of a Dockerfile installs the NVIDIA packages and GPU-enabled versions of TensorFlow and xgboost (along with a ton of other things - it’s our group’s common working image, adapted for GPU). I’ve neglected it for a while and I’m not 100% sure it works in its current condition, but it was working at some point :grimacing::
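The core of that second approach can be sketched roughly as follows. This starts from a rocker image and adds NVIDIA's CUDA packages via their apt repository; the repo path, keyring version, and CUDA toolkit version are assumptions - check NVIDIA's current install instructions for the right ones:

```dockerfile
# Hypothetical sketch: add CUDA into an existing rocker image.
# Versions and repo paths below are assumptions, not tested values.
FROM rocker/verse:latest

# Register NVIDIA's CUDA apt repository and install the toolkit.
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
        wget ca-certificates gnupg && \
    wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb && \
    dpkg -i cuda-keyring_1.1-1_all.deb && \
    rm cuda-keyring_1.1-1_all.deb && \
    apt-get update && \
    apt-get install -y --no-install-recommends cuda-toolkit-11-8 && \
    rm -rf /var/lib/apt/lists/*

# Make the CUDA compiler and libraries visible to later build steps
# (e.g. compiling GPU-enabled xgboost from source).
ENV PATH=/usr/local/cuda/bin:${PATH} \
    LD_LIBRARY_PATH=/usr/local/cuda/lib64:${LD_LIBRARY_PATH}
```

GPU-enabled xgboost and TensorFlow would then be built or installed on top of this in further layers.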


Whoops, I just realized I copied the same link twice. This is the one that builds rocker on top of the NVIDIA base:


Awesome, thanks for this Noam.

Not knowing the rocker stack that well, I went with the second approach.

I made some minor adjustments to the Dockerfile you linked to (which is all in working order, by the way) but rebased it onto rocker/tidyverse.

If it helps anyone, my modified version is here (currently not working with h2o):
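For anyone attempting the same rebase, the change is essentially just the base image line; the rest of the NVIDIA/CUDA install steps carry over unchanged (tag below is an assumption):

```dockerfile
# Hypothetical sketch of the rebase: swap the base image only.
FROM rocker/tidyverse:latest
# ...then continue with the same NVIDIA/CUDA install steps as in
# the original Dockerfile linked above.
```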

Very much looking forward to an out-of-the-box GPU-enabled rocker container! I just got my hands on a GPU - hopefully it is worth the setup effort.




Glad it helped and good to know it’s functioning!
