FAQ - bwUniCluster broadwell partition
Revision as of 11:06, 18 September 2017
FAQs concerning best practice for the bwUniCluster broadwell partition (aka "extension" partition).
- 1 Login
- 2 Compilation
- 3 Job execution
1 Login
1.1 Are there separate login nodes for the bwUniCluster broadwell partition?
- Yes, but primarily to be used for compiling code.
1.2 How to log in to the broadwell login nodes?
- You can log in directly to the login nodes of the broadwell partition.
- If you log in to the old uc1 login nodes, you can still reach the broadwell nodes using the same procedure as for 'compute nodes'.
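A minimal login sketch; the hostname and username placeholder below are assumptions for illustration, not confirmed by this FAQ — check the official bwUniCluster documentation for the actual extension login hostname:

```shell
# Log in directly to a broadwell (extension) login node.
# Hostname is an assumption -- verify against the cluster documentation.
ssh <username>@uc1e.scc.kit.edu
```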
2 Compilation
2.1 How to compile code on broadwell (extension) nodes?
On uc1 (old) login nodes:
On uc1e (extension) login nodes:
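A hedged sketch of both cases with the Intel compilers; only the AVX/AVX2 distinction is implied by this FAQ, so the exact flags and file names below are illustrative assumptions:

```shell
# On uc1 (old, AVX-only) login nodes: cross-compile for the broadwell
# nodes by requesting AVX2 code generation explicitly.
icc -xCORE-AVX2 -O2 -o myprog myprog.c

# On uc1e (extension) login nodes: compile natively; -xHost targets
# the instruction set of the (broadwell) login node itself.
icc -xHost -O2 -o myprog myprog.c
```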
2.2 How to compile the same code for both the old and the extension partition?
On uc1e (= extension) login nodes:
icc/ifort -xAVX -axCORE-AVX2
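Applied to a full compile line, these flags produce a baseline AVX binary with an additional CORE-AVX2 code path that the Intel runtime dispatches automatically on broadwell nodes (source and output file names are illustrative):

```shell
# -xAVX: baseline code path, runs on the old (AVX-only) partition.
# -axCORE-AVX2: additional optimized code path, auto-selected at run
#               time on the broadwell (extension) nodes.
icc  -xAVX -axCORE-AVX2 -O2 -o myprog myprog.c
ifort -xAVX -axCORE-AVX2 -O2 -o myprog myprog.f90
```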
2.3 What happens with code compiled for the old partition running on the extension partition?
Code will run, but significantly slower, since AVX2 instructions will not be used. Please recompile your code accordingly.
3 Job execution
3.1 How to submit jobs to the broadwell (= extension) partition?
Submitted jobs will be dispatched to the broadwell nodes if the queue is specified correctly, i.e.:
msub -q multinode
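A fuller submission sketch; only `-q multinode` comes from this FAQ — the resource values (node count, processes per node, walltime) and the job script name are illustrative assumptions:

```shell
# Submit a job script to the broadwell (extension) partition.
# nodes/ppn/walltime values are examples, not cluster defaults.
msub -q multinode -l nodes=2:ppn=28,walltime=01:00:00 ./myjob.sh
```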