NEMO2/Login


Access to NEMO2 is limited to IP addresses from the BelWü network. All home institutions of our current users are connected to BelWü, so if you are on your campus network (e.g. in your office or on the campus WiFi) you should be able to connect to NEMO2 without restrictions. If you are outside one of the BelWü networks (e.g. at home), a VPN connection to the home institution or a connection to an SSH jump host at the home institution must be established first.
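The jump-host route can be sketched with OpenSSH's ProxyJump flag (-J); the jump-host name below is a placeholder, ask your home institution for the name of its real SSH gateway:

```
# Placeholder jump host; replace with your institution's SSH gateway.
ssh -J <username>@jumphost.example-university.de \
       <username>@nemo2-login.nemo.uni-freiburg.de
```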

The login nodes of the bwHPC clusters are the access point to the compute system, your $HOME directory and your workspaces. All users must log in through these nodes to submit jobs to the cluster.

Prerequisites for a successful login: you need to have completed the NEMO2 registration and configured a second factor (2FA).


The NEMO2 registration continues to use the old registration page https://bwservices.uni-freiburg.de, while the second factor (2FA) must be configured on the new registration page https://login.bwidm.de. So far, https://login.bwidm.de is not used for anything else!


Login to bwForCluster NEMO 2

Login to bwForCluster NEMO 2 is only possible with a Secure Shell (SSH) client. You must know your username on the cluster and the hostname of a login node. For more general information on SSH clients, visit the SSH clients Guide.

Username

If you want to use the NEMO2 cluster you need to add a prefix to your local username. For prefixes please refer to the Username Wiki.

Example:

  • If your local username is vwxyz1234 and you are a user from the University of Freiburg, this combines to: fr_vwxyz1234.


Hostnames

The system has two login nodes. You have to select the login node yourself.

Hostname                           Node type
nemo2-login.nemo.uni-freiburg.de   NEMO2 first or second login node
nemo2-login1.nemo.uni-freiburg.de  NEMO2 first login node
nemo2-login2.nemo.uni-freiburg.de  NEMO2 second login node


Login with SSH command (Linux, Mac, Windows)

Most Unix and Unix-like operating systems such as Linux, macOS, and *BSD, as well as newer versions of MS Windows 10 and 11, have a built-in SSH client provided by the OpenSSH project. If you want to use the full set of Linux commands on Windows, you can also easily install the Windows Subsystem for Linux (WSL) on newer Windows 10 and 11 versions. Then start a terminal: e.g. "xterm", "konsole" or "gnome-terminal" under Linux, the "Terminal" app under macOS, or under Windows the "Windows Terminal" app, the Command Prompt (command: 'cmd'), or "PowerShell".

For login use one of the following ssh commands:

ssh <username>@nemo2-login.nemo.uni-freiburg.de
ssh -l <username> nemo2-login.nemo.uni-freiburg.de
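Both forms can be shortened with a client-side alias in ~/.ssh/config. A minimal sketch; the alias name nemo2 is an arbitrary choice:

```
# ~/.ssh/config (sketch)
Host nemo2
    HostName nemo2-login.nemo.uni-freiburg.de
    User <username>
    ServerAliveInterval 60   # keep idle sessions alive
```

Afterwards, 'ssh nemo2' is enough.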

To run graphical applications, you can pass the -X or (less restrictive) -Y flag to ssh to enable X11 forwarding:

ssh -Y -l <username> nemo2-login.nemo.uni-freiburg.de

For better graphical performance, we recommend using the VNC module on the cluster: 'module load vis/turbovnc'.
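A hedged sketch of a typical TurboVNC session; the module name is taken from the text above, but the exact vncserver options and the display number in the output may differ on NEMO2:

```
# On the login node (sketch; the reported display number will vary):
module load vis/turbovnc
vncserver -geometry 1920x1080     # note the display it reports, e.g. :1
# On your local machine, tunnel the VNC port (5900 + display number):
#   ssh -L 5901:localhost:5901 <username>@nemo2-login1.nemo.uni-freiburg.de
# then point a VNC viewer at localhost:5901
```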

Login with graphical SSH client (Windows)

For Windows we suggest using MobaXterm for login and file transfer.

Start MobaXterm, create a new SSH session, and fill in the following fields:

Remote host              : nemo2-login.nemo.uni-freiburg.de    # or nemo2-login1.nemo.uni-freiburg.de, nemo2-login2.nemo.uni-freiburg.de
Specify user name        : <username>
Port                     : 22

After that, click 'OK'. A terminal will open where you can enter your credentials.

Login Example

To log in to NEMO2, you must provide your service password. Proceed as follows:

  1. Use SSH to connect to one of the login nodes.
  2. When you log in for the first time, you must confirm the SSH fingerprint of the login node.
  3. The system will ask you for your second factor (Your OTP:) and your service password (Password:). Enter each and confirm with Enter/Return. If you do not have a service password yet or have forgotten it, please create one (see Registration/Password).
  4. You will be greeted by the cluster, followed by a shell.

~ $ ssh fr_vwxyz1234@nemo2-login.nemo.uni-freiburg.de
The authenticity of host 'nemo2-login.nemo.uni-freiburg.de (<no hostip for proxy command>)' can't be established.
ED25519 key fingerprint is SHA256:8t+lpTeEba0TmZHvSfMyrQn4WWaqquBoT3hvWRuXdGo.
This host key is known by the following other names/addresses:
...
Are you sure you want to continue connecting (yes/no/[fingerprint])? yes
Warning: Permanently added 'nemo2-login.nemo.uni-freiburg.de' (ED25519) to the list of known hosts.
(fr_vwxyz1234@nemo2-login.nemo.uni-freiburg.de) Your OTP: 123456
(fr_vwxyz1234@nemo2-login.nemo.uni-freiburg.de) Password: 
Last login: Tue Mar  4 13:48:19 2025 from 132.230.103.17
[fr_vwxyz1234@login3 ~]$


SSH Keys on NEMO2

SSH keys can be used for interactive logins, but a second factor is still required. You must copy your public SSH keys into:

~/.ssh/authorized_keys
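Generating a key pair can be sketched as follows. A temporary directory is used here so the example is self-contained; in practice you would keep the key under ~/.ssh/:

```shell
# Generate a dedicated ed25519 key pair for NEMO2 (sketch).
keydir="$(mktemp -d)"
ssh-keygen -q -t ed25519 -N '' -C 'nemo2 key' -f "$keydir/id_nemo2"
# The public key is what goes into ~/.ssh/authorized_keys on NEMO2; you
# can transfer it with ssh-copy-id (password and OTP are asked once):
#   ssh-copy-id -i "$keydir/id_nemo2.pub" <username>@nemo2-login.nemo.uni-freiburg.de
cat "$keydir/id_nemo2.pub"
```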

Workflow and Command SSH Keys

This example uses 'rrsync' (see rrsync wiki).

If you want to run workflows and commands without a second factor, you can use SSH command keys, i.e. keys restricted to a single forced command. The forced-command options need to look like this:

command="/usr/local/bin/rrsync -ro /home/aa/aa_bb/aa_abc1/",from="10.10.10.0/24"

Please send the following information to the NEMO2 support:

command="/usr/local/bin/rrsync -ro /home/aa/aa_bb/aa_abc1/",from="10.10.10.0/24" ssh-rsa ... username

This key will then be allowed on nemo2-login[1|2].nemo.uni-freiburg.de. Example:

rsync -av nemo2-login.nemo.uni-freiburg.de:remotedir/ localdir/
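The line sent to support is just the forced-command options followed by the public key. A sketch that assembles such a line locally; the paths and network are the example values from above, and the generated RSA key is a throwaway purely for illustration:

```shell
# Assemble a command-restricted authorized_keys line (sketch).
opts='command="/usr/local/bin/rrsync -ro /home/aa/aa_bb/aa_abc1/",from="10.10.10.0/24"'
keydir="$(mktemp -d)"
# Throwaway RSA key for illustration only:
ssh-keygen -q -t rsa -b 3072 -N '' -C 'nemo2 rsync key' -f "$keydir/id_rrsync"
line="$opts $(cat "$keydir/id_rrsync.pub")"
printf '%s\n' "$line"   # this is what you would send to NEMO2 support
```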


Allowed Activities on Login Nodes


To guarantee usability for all users of the cluster, you must not run your compute jobs on the login nodes. Compute jobs must be submitted to the queuing system. Any compute job running on a login node will be terminated without notice. Any long-running compilation or any long-running pre- or post-processing of batch jobs must also be submitted to the queuing system.

The login nodes are shared among all users; therefore, your activities on the login nodes are limited primarily to setting up your batch jobs. Acceptable activities also include:

  • short compilation of your program code and
  • short pre- and post-processing of your batch jobs.


Related Information