BwUniCluster2.0/Software/Python Dask
This guide explains how to use Python Dask and dask-jobqueue on bwUniCluster2.0.
Installation
Use one of our pre-configured Python modules and load them with 'module load ...'. You have to install the packages 'dask' and 'dask-jobqueue' if you are using your own conda environment.
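Once the module is loaded or the packages are installed in your environment, a quick way to check that everything is available is to import both packages in Python (a minimal sketch, no cluster is started here):

<pre>
>>> import dask
>>> import dask_jobqueue
>>> dask.__version__, dask_jobqueue.__version__   # print the installed versions
</pre>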
Using Dask
In a new interactive shell, execute the following commands in Python:
<pre>
>>> from dask_jobqueue import SLURMCluster
>>> cluster = SLURMCluster(cores=X, memory='X GB', queue='X')
</pre>
You have to specify how many cores and how much memory you want for one dask worker. See BwUniCluster_2.0_Batch_Queues for the available queues.
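As an illustration only, a worker with 4 cores and 8 GB of memory on a queue named 'single' could be requested as below; the queue name, walltime and resource values are placeholder examples, not recommendations, so adjust them to your job:

<pre>
>>> from dask_jobqueue import SLURMCluster
>>> # one SLURM job per dask worker, each with 4 cores and 8 GB of memory (example values)
>>> cluster = SLURMCluster(cores=4, memory='8 GB', queue='single', walltime='00:30:00')
</pre>

The next step is to start workers by scaling the cluster: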
<pre>
>>> cluster.scale(X)
</pre>
Replace X with the number of dask workers you want to start.
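To actually run computations on those workers, connect a distributed Client to the cluster. The following is a minimal end-to-end sketch with example values; dask.array is used only to have something to compute:

<pre>
>>> from dask.distributed import Client
>>> import dask.array as da
>>> client = Client(cluster)    # connect to the scheduler of the SLURMCluster created above
>>> cluster.scale(2)            # request 2 dask workers, i.e. 2 SLURM jobs (example value)
>>> x = da.random.random((10000, 10000), chunks=(1000, 1000))
>>> x.mean().compute()          # the computation runs on the dask workers
</pre>

When you are done, release the resources with client.close() and cluster.close().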