BinAC/Quickstart Guide

1 Basics

Use the work file system, not your home directory, for your calculations. Create a working directory named after your username.

cd /beegfs/work/
mkdir <username>
cd <username>

Do not use the login nodes to carry out any calculations or heavy file transfers.
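
If you need to bring input data onto BinAC, copy it from your local machine into your work directory, for example with rsync. This is a minimal sketch; <loginnode> is a placeholder for the BinAC login host name given in the login documentation.

rsync -avP ./input_data/ <username>@<loginnode>:/beegfs/work/<username>/input_data/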

2 Check the Queue

To check all running and queued jobs.

qstat

Just your own jobs.

qstat -u <username>
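
For full details on a single job, the standard qstat option -f can be used.

qstat -f <jobID>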

3 Simple Interactive Job

To start a one-core job on a compute node, providing a remote shell.

qsub -q short -l nodes=1:ppn=1 -I

The same, but requesting a whole node (all 28 cores).

qsub -q short -l nodes=1:ppn=28 -I

Standard Unix commands are directly available; for everything else, use the module system.

module avail

For example, load a Gromacs module and check that its commands are available.

module load chem/gromacs/4.6.7-gnu-4.9
g_luck
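
To see which modules are currently loaded, or to inspect what a particular module sets up, use the standard module commands.

module list
module show chem/gromacs/4.6.7-gnu-4.9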

Be aware that we allow node sharing. Do not disturb the calculations of other users.

4 Simple Script Job

Use your favourite text editor to create a job script, e.g. script.sh.

#PBS -l nodes=1:ppn=1
#PBS -l walltime=00:05:00
#PBS -S /bin/bash
#PBS -N Simple_Script_Job
#PBS -j oe
#PBS -o LOG
#PBS -n
# change to the directory from which the job was submitted
cd $PBS_O_WORKDIR
echo "my Username is:"
whoami
echo "My job is running on node:"
uname -a
module load chem/gromacs/4.6.7-gnu-4.9
g_luck

Submit the job using

qsub -q short script.sh

Take a note of your jobID.
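
You can follow the job with qstat. Since the script joins stdout and stderr (-j oe) and writes them to the file LOG (-o LOG), you can inspect the output in the submission directory once the job has finished.

qstat <jobID>
cat LOG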

5 Killing a Job

Let's assume you pulled a Homer, i.e. made a mistake, and want to stop/kill/remove a running job.

qdel <jobID>
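
If you want to remove all of your own jobs at once, the standard Torque tools qselect and qdel can be combined; this is a generic Torque pattern, not something BinAC-specific.

qselect -u <username> | xargs qdel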

6 Fancy Script Job

#PBS -l nodes=1:ppn=28:gpus=4:exclusive_process
#PBS -l walltime=36:00:00
#PBS -S /bin/bash
#PBS -N Gromacs_GPU
#PBS -j oe
#PBS -o LOG
#PBS -n
module purge
module load devel/cuda/7.5 mpi/mvapich2/2.1-gnu-4.9 numlib/openblas/0.2.18-gnu-4.9
source /opt/bwhpc/common/chem/gromacs/2016_gnu-4.9/bin/GMXRC.bash
cd $PBS_O_WORKDIR
gmx grompp -f NPT.mdp -c protein.pdb -n index.ndx -p topol.top
mdrun_s_gpu -v -deffnm NPT_protein -pin on -ntmpi 4 -ntomp 7 -gpu_id 0123 -s topol.tpr

Submit with

qsub -q gpu script.sh
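
To get an overview of the available queues and their limits, the standard qstat option -q can be used.

qstat -q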

There are tons of options, details and caveats. If anything is not working as you like, send an email to hpcmaster@uni-tuebingen.de.