First of all, you need a research user account to access the cluster. Keep in mind that access from outside the campus requires a VPN connection for security reasons. Here you will find the instructions to request the research user and to configure the VPN.
To ensure a secure login session, users must connect to the login node using the Secure Shell (SSH) protocol.
Before any login session can be initiated using SSH, a working SSH client must be present on the local machine. Wikipedia is a good source of general information on SSH and lists the various clients available for your particular operating system, but here we'll cover common examples for Windows 7, Ubuntu Linux and Mac OS Mavericks.
Remember that the login node is not a compute node. To load modules and run your applications, you must connect to one of the compute nodes through the interactive or salloc command (direct SSH access to the nodes has been disabled to prevent interactive use of them without going through Slurm):
test@login01:~$ module load MATLAB/2017a
-bash: module: command not found
test@login01:~$ interactive
salloc: Granted job allocation 391355
test@node001:~$ module load MATLAB/2017a
test@node001:~$ module list
Currently Loaded Modules:
1) Java/1.8.0_121 2) MATLAB/2017a
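As the transcript shows, the interactive command obtains a Slurm allocation on your behalf (note the "salloc: Granted job allocation" message). If you prefer to call salloc directly, a roughly equivalent request might look like the sketch below; the resource values and time limit are illustrative assumptions, and whether salloc opens a shell directly on the compute node depends on the site configuration:

# Request an interactive allocation of one task on one node for 30 minutes (illustrative values)
salloc --nodes=1 --ntasks=1 --time=00:30:00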
Once you are connected to the login node, you can submit jobs to the scheduler through the "sbatch" command. Whatever is inside the script will be executed on the node assigned by the scheduler, not on the login node:
test@login01:~/Scripts/Matlab$ sbatch matlab.sh
Submitted batch job 391235
test@login01:~/Scripts/Matlab$ squeue -u test
 JOBID PARTITION     NAME     USER ST       TIME  NODES NODELIST(REASON)
391235     short matlab_t     test  R       0:09      1 node007
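For illustration, the submitted script matlab.sh could look like the sketch below. The resource requests and the MATLAB script name (my_analysis.m) are assumptions you should adapt to your own job; the partition and module names simply match the examples above:

#!/bin/bash
#SBATCH --job-name=matlab_test
#SBATCH --partition=short          # partition shown in the squeue output above
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --time=01:00:00            # illustrative time limit
#SBATCH --output=matlab_%j.out     # %j is replaced by the job ID

# These commands run on the compute node assigned by the scheduler, not on the login node
module load MATLAB/2017a

# my_analysis.m is a hypothetical script name used here only as an example
matlab -nodisplay -nosplash -r "run('my_analysis.m'); exit"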
You need to install a set of free tools on your workstation to interact with the cluster:
PuTTY: www.putty.org
Purpose: Free SSH client
Xshell: http://www.netsarang.com/download/free_license.html
Purpose: Free SSH client
In this tutorial, we'll work with Putty and XShell.
Putty:
Open an SSH connection to the server hpc.s.upf.edu. The first time, you'll get a warning about the SSH host key of hpc.s. Accept it permanently so that this window does not appear every time:
Use your research username and password to get access to the server.
login as: test
test@hpc.s.upf.edu's password:
Last login: Tue Dec 12 19:39:51 2017 from 10.60.84.200
test@login01:~$
Connecting to SNOW from a Linux machine is pretty easy, as everything is built in by default in every distribution. Simply open a terminal and type 'ssh -X username@hpc.s.upf.edu', where:
username is your research user, in its usual form
-X tells your Linux client to forward and manage any graphical X sessions you may need to open over hpc.s.upf.edu
Simply using the terminal application, we can connect to the Slurm cluster:
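For example, to open the connection with X forwarding (replace username with your research user):

# Connect to the cluster with X11 forwarding enabled
ssh -X username@hpc.s.upf.edu

Optionally, you can add an entry like the following to ~/.ssh/config so that typing 'ssh snow' is enough; the alias name snow is just an illustration:

# ~/.ssh/config (the Host alias is an arbitrary choice)
Host snow
    HostName hpc.s.upf.edu
    User username
    ForwardX11 yes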