Accessing and running tutorial material

Tutorials will be delivered via Jupyter notebooks (download information to follow) which should be run remotely on a GPU-equipped desktop machine managed by the SC RTP. Longer running examples and benchmarking exercises should use the GPU-equipped nodes on the Tinis HPC cluster and will require editing/running code in the terminal, via remote desktop or locally before uploading to the cluster.

To access/run/edit the examples you will need to open a remote terminal on an SC RTP managed Linux machine.

WARNING : The desktop Linux configuration we will be using is still under testing and won't be rolled out to all machines until the end of July. Be prepared to see warning messages about the system still being under development.

Remote access options

Remote desktop via X2Go : For users less familiar with working remotely on Linux machines, we recommend downloading the X2Go client (available for Windows, macOS and Linux) onto your laptop. This should be configured to connect to brigitte.csc.warwick.ac.uk. Select "MATE" from the session menu and connect using your SC RTP desktop username and password. This will launch a remote desktop window which you can interact with directly, as if you were physically sitting in front of one of our managed machines.

Once connected you will need to launch a terminal client. Several are available via the "System Tools" section of the "Applications" menu. You might also want to use the "Places" menu to explore/manage your home filespace on the SC RTP system. If in doubt, ask for help. Further information on the graphical Linux desktop is available via the RSE group web pages.

Access via a Linux laptop: If you have a Linux laptop then you can connect to brigitte.csc.warwick.ac.uk from a terminal window using ssh.

$ ssh marxyz@brigitte.csc.warwick.ac.uk

Replace marxyz with your SC RTP username. Files can be transferred from your laptop to your home space on the SC RTP system using scp or sftp commands as explained here.
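As a sketch (the filenames below are placeholders; replace marxyz with your username), transferring material to your home space via godzilla, the machine set aside for file transfers, might look like:

```shell
# Copy a single file from your laptop to your SC RTP home directory
scp tutorial.ipynb marxyz@godzilla.csc.warwick.ac.uk:~/

# Copy a whole directory recursively
scp -r my_tutorials marxyz@godzilla.csc.warwick.ac.uk:~/

# Fetch a results file back to the current directory on your laptop
scp marxyz@godzilla.csc.warwick.ac.uk:~/results.txt .
```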

You may prefer a graphical sftp client. For example, using the default Ubuntu file manager you can browse to other locations -> connect to server and then enter

sftp://godzilla.csc.warwick.ac.uk

which will navigate to your home directory on the SC RTP system. You can then drag files between windows browsing your local machine and the remote system. Note that your home space is the same on all SC RTP desktops, regardless of which you connect to. Godzilla is the machine we set aside for remote file transfers - it has a faster connection to the campus network than standard desktops, but you must not run code on Godzilla!

Access via a Mac laptop: As with a Linux laptop, you can open a remote terminal session using ssh in the MacOS "Terminal" application, and copy files via scp or sftp. If you prefer a graphical file transfer client then Cyberduck is popular and intuitive.

Access via a Windows laptop: First, you want to go home and rethink your life. Second, if you still have a Windows laptop then consider using the X2Go option above. Alternatively, PuTTY is a suitable client for opening a remote terminal on SC RTP Linux machines, and WinSCP is a suitable way to transfer files between your local laptop and the SC RTP desktop computers. PuTTY is restricted to a text-based terminal interface. If you want to launch graphical software remotely then MobaXterm is an easy way to enable this in a single piece of software, without needing to install a separate X server.


Software environment

The SC RTP machines use environment modules so that users can customise their Linux environment for specific tasks/purposes, e.g. these tutorials. You can load the modules we will need with the following incantation from within a terminal on brigitte.csc.warwick.ac.uk or any other managed desktop machine with a modern GPU.

$ module load GCC/6.4.0-2.28 OpenMPI Python OpenBLAS CUDA PyCUDA pygpu numba pyculib Pillow IPython matplotlib

That's a pain to remember, so we can save this collection of modules with an easy-to-remember name like "gpuschool".

$ module save gpuschool

In future sessions you'll be able to recall this selection of modules with the command

$ module restore gpuschool

You should now have an environment based on Python 3.6.3 with additional GPU-enabled libraries ready to use.
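A quick sanity check of this environment might look as follows (the expected version matches the Python module named above; numba.cuda.is_available() is part of Numba's public API):

```shell
# Confirm the Python from the module environment is first on your PATH
which python
python --version          # should report Python 3.6.3

# Confirm the GPU libraries import cleanly and a CUDA device is visible
python -c "from numba import cuda; print(cuda.is_available())"
```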

Be aware that the GPU nodes on Tinis do not share files/settings with the other SC RTP (desktop) computers. The module system on Tinis is based on the same software, and the modules have the same names, but you will need to load them again once you've connected to those nodes.

Jupyter notebooks

The notebooks for the various tutorials can be obtained from the Warwick RSE github account.

$ git clone https://github.com/WarwickRSE/gpuschool2018
$ cd gpuschool2018

If you are connected via X2Go and using a terminal window via remote desktop, you can then open the Jupyter notebook interface in a browser using

$ jupyter notebook 

and then select the notebook you wish to open.

If you are connecting via ssh and want to launch the notebooks such that you can connect to them from a web browser running locally on your laptop, the incantations are a little more complicated. First set a password for remote access to your Jupyter notebooks.

$ jupyter notebook password 

Make sure this is a strong password, and don't forget it. Then launch the notebook server, specifying that we don't want to launch a browser automatically and that we want to connect to the machine via its network hostname.

$ jupyter notebook --no-browser --ip=`hostname -f`

This should print a URL to screen (something like http://xxxxxxx.yyy.warwick.ac.uk:nnnn) which you can copy and paste into a web browser running on your local laptop. You will need the password you set in order to connect.

Note it is not possible to connect to remotely running notebooks in this way from off-campus without tunnelling your web traffic via another ssh connection.
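If you do need off-campus access, one approach is an ssh tunnel of this sort (a sketch only - the hostname and port nnnn are placeholders, to be taken from the URL Jupyter printed):

```shell
# Forward local port 8888 to the notebook server's port via brigitte.
# Substitute the real hostname and port from the URL Jupyter printed.
ssh -L 8888:xxxxxxx.yyy.warwick.ac.uk:nnnn marxyz@brigitte.csc.warwick.ac.uk
# Then browse to http://localhost:8888 on your laptop and enter your
# notebook password.
```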

Remember to kill the notebook server with ctrl-C when you're done for the day.


Using the GPU nodes on Tinis

The GPU nodes in Tinis are a shared university-wide resource and are normally only accessible via batch queues. For the purposes of these tutorials the batch queues will have been disabled and we can connect directly to two of these nodes and run interactively. Don't treat this as the normal way to access the resource - it will stop working after the school has finished!

First you need to connect to the Tinis login node. This will use the public/private key pair you created when applying for your Tinis account, so make sure you remember the passphrase you entered (if any) when generating those keys.

$ ssh marxyz@tinis.csc.warwick.ac.uk

If you put your private key in a non-standard location then you will need to specify a path to that with the -i option to ssh. At this point you are connected to the Tinis login node, not a GPU node. You must not run code on the login node.
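For example (the key path here is purely illustrative):

```shell
# Point ssh at a private key stored in a non-standard location
ssh -i ~/.ssh/tinis_key marxyz@tinis.csc.warwick.ac.uk
```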

From here, however, you can connect to a GPU node we have reserved for this school. Use either gpu3 or gpu4 (Tinis has four GPU nodes).

[marxyz@tinis:login1 ~]$ ssh gpu4

Once connected to a GPU node, remember to load the modules we'll need, as per the above.

[marxyz@gpu4 ~]$ module load GCC/6.4.0-2.28 OpenMPI Python OpenBLAS CUDA PyCUDA pygpu numba pyculib Pillow

Again, you can save this collection of modules on Tinis using the module save command. Note that we haven't loaded IPython or matplotlib - the usual purpose of Tinis is to run unattended (non-interactive) scripts which write their output for later analysis/interpretation. Running notebooks on Tinis is not supported.

For the exercises which use Tinis, you will need to specify which of the four GPUs on the node you wish to use (we'll come up with some way of randomly assigning course participants to GPUs), paste your Python code into a file with a .py extension (e.g. myscript.py), and run it directly on the GPU node, for example:

[marxyz@gpu4 ~]$ export CUDA_VISIBLE_DEVICES=0   # Pick 0-3 to identify which of the 4 GPUs to use
[marxyz@gpu4 ~]$ python myscript.py
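Before starting a long-running exercise, it can be worth confirming that your device selection has taken effect; Numba's cuda.detect() call prints the devices it can see:

```shell
# With CUDA_VISIBLE_DEVICES set, only the chosen GPU should be listed
python -c "from numba import cuda; cuda.detect()"
```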

You can transfer files to and from Tinis using the same mechanism as you used for moving files between the SC RTP desktop system and your laptop. Ask for help with this if unsure.

Note that for normal usage (outside of these tutorials) we would submit a batch script to the queueing system from the login node, and request a GPU as detailed here. Module commands would be placed in that batch script before invoking python, but we would never specify which GPU to use ourselves as we have above. The system would assign an available GPU automatically.
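As an illustration of that normal workflow only (this assumes the Tinis scheduler is SLURM; the partition name, gres syntax and resource values below are placeholders, so check the linked documentation for the real ones), a batch script might look like:

```shell
#!/bin/bash
#SBATCH --job-name=gpu-example    # job name is illustrative
#SBATCH --partition=gpu           # partition name is a placeholder
#SBATCH --gres=gpu:1              # request one GPU; exact syntax may differ
#SBATCH --time=00:10:00

# Load modules inside the batch script, before invoking python
module load GCC/6.4.0-2.28 OpenMPI Python OpenBLAS CUDA PyCUDA pygpu numba pyculib Pillow

# Do NOT set CUDA_VISIBLE_DEVICES here - the scheduler assigns a GPU
python myscript.py
```

You would submit this from the login node with a command such as sbatch, rather than running it directly.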