= Creating a custom JupyterLab GPU-enabled TensorFlow image =
This page describes the steps needed to produce a ##tensorman## image that has all the elements of an environment that uses the GPU and also has a suitable ##conda## (% class="mark" %)Jupyter(%%) environment.
= Steps =

(% class="box" %)
{{{sudo tensorman run -p 8889:8889 --root --gpu --python3 --jupyter --name jan bash
}}}

The port specified should be appropriately mapped to the outside using the nginxproxymanager. This specific port, 8889, is assigned to user jan. Eventually the container will be accessible from [[https:~~/~~/jupyter-jan.informeer.de>>https://jupyter-jan.informeer.de]].
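As an optional sanity check (not part of the original write-up) you can confirm from the ##docker## host that the port is actually published before wiring it into the nginxproxymanager; the container name ##jan## matches the command above.

(% class="box" %)
{{{# confirm that the container publishes port 8889 on the host
docker ps --filter name=jan --format '{{.Names}}: {{.Ports}}'
# once jupyter lab is running inside the container, the port should answer locally
curl -sI http://localhost:8889 | head -n 1
}}}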
Inside the running instance do the following:

(% class="box" %)
{{{apt update
apt install git
}}}

We want to have ##git## available later because we are going to install ##git## support in ##jupyter-lab##, and that requires ##git## itself to be present.

Apart from the plain ##jupyter## notebook we also want (actually prefer) ##jupyter lab##. The ##tensorflow## image has ##jupyter## installed by means of ##pip##, hence we adopt the same procedure. After all, this is a separate environment inside a ##docker## image. We can still use ##conda## for our actual working environments and packages, because the current working directory is mounted under the ##/project## volume.
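As a small illustration of that mount (an optional check, not in the original steps): a file created in the host directory you started ##tensorman## from shows up under ##/project## inside the container, so work stored there outlives the container. The file name ##marker.txt## is just an arbitrary example.

(% class="box" %)
{{{# on the host, in the directory tensorman was started from
touch marker.txt
# inside the running tensorman instance
ls /project    # marker.txt should be listed
}}}

With that confirmed, install ##jupyterlab## and its ##git## extension with ##pip## as announced above: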
(% class="box" %)
{{{pip install jupyterlab jupyterlab-git
}}}
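Still inside the container, you can optionally verify that the ##git## extension was picked up by ##jupyter-lab## (the exact output varies with the ##jupyterlab## version):

(% class="box" %)
{{{# jupyterlab-git should appear in the list of enabled extensions
jupyter labextension list
}}}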
Then, from another terminal/console, run this:

(% class="box" %)
{{{tensorman save jan jupyter-lab-gpu
}}}

The first argument, ##jan##, is the container name and the second, ##jupyter-lab-gpu##, is the image name (different names can be chosen, of course). This saves the container and makes the changes (installing ##git## and the ##pip## packages) persistent. Next we start the container as a regular user (who has to be in the ##docker## group, though). The ##~-~-name $USER## part is added to be able to identify the container when multiple (different) instances are present.
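If the regular user is not yet in the ##docker## group, it can be added with the standard ##docker## command below (a log out/in is needed afterwards); ##tensorman list## should then show the newly saved image, assuming that subcommand is available in your ##tensorman## version.

(% class="box" %)
{{{# add the current user to the docker group (log out and back in afterwards)
sudo usermod -aG docker $USER
# the saved image jupyter-lab-gpu should now appear among the local images
tensorman list
}}}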
When starting ##tensorman## it will mount the current working directory on the ##tensorman (docker)## ##/project## volume. If you choose to start in ##$HOME## this may cause interference with local configurations, i.e. outside of ##tensorman (docker)##. To prevent this it is advised to start in a subdirectory of ##$HOME##, e.g. ##$HOME/project## (although it can be any name). All subsequent actions from within the ##tensorman## ##docker## instance will use this as its ##$HOME##. This means you can make a fresh ##conda## install without affecting the one in the host's ##$HOME##.

(% class="box" %)
{{{mkdir $HOME/project
cd $HOME/project
tensorman =jupyter-lab-gpu run -p 8889:8889 --gpu --python3 --jupyter --name $USER bash
}}}

The ##~-~-python3## and ##~-~-jupyter## flags may not be needed, as they have already been incorporated in the saved image ##jupyter-lab-gpu##.
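If so, the start command can presumably be shortened to something like this (an untested variant of the command above):

(% class="box" %)
{{{cd $HOME/project
tensorman =jupyter-lab-gpu run -p 8889:8889 --gpu --name $USER bash
}}}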
Next we install ##conda## inside the running ##tensorman## container.

(% class="box" %)
{{{pip install conda
conda install numpy
conda install pandas
}}}
We clone the tensorflow docs git repository because it holds a few examples that we can use to test the setup:

(% class="box" %)
{{{mkdir tensorflow
cd tensorflow
git clone https://github.com/tensorflow/docs.git
}}}
To be able to run the ##tensorflow## examples we install the following ##conda## packages:

(% class="box" %)
{{{conda install keras-gpu
conda install matplotlib
}}}
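Before launching ##jupyter lab## you can already check from the bash prompt whether ##tensorflow## sees the GPU; the one-liner below assumes a TensorFlow 2.x image.

(% class="box" %)
{{{# should print at least one GPU device if the container has GPU access
python3 -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
}}}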
Finally, from the ##tensorman## image bash prompt, we execute:

(% class="box" %)
(((
{{{jupyter lab --ip=0.0.0.0 --port=8889 --no-browser}}}
)))
Now we can open the browser at https://jupyter-jan.informeer.de (or whatever we configured in the nginxproxymanager). For example:

https://jupyter-jan.informeer.de/lab/tree/tensorflow/docs/site/en/r1/tutorials/keras/basic_classification.ipynb

This demonstrates a canonical image classification problem. A basic test can be executed using this notebook:

https://jupyter-jan.informeer.de/lab/tree/source/tests/test-gpu.ipynb

(% style="text-align:center" %)
[[image:Screenshot from 2021-05-28 18-15-06.png||alt="the newly created container shown in portainer"]]

[[image:Screenshot from 2021-05-28 18-15-21.png||alt="output from test notebook demonstrating GPU detection"]]
