BwUniCluster3.0/Software/Python Dask
This guide explains how to use Python Dask and dask-jobqueue on bwUniCluster3.0.
== Installation and Usage ==
Please have a look at our [https://github.com/hpcraink/workshop-parallel-jupyter Workshop] on how to use Dask on bwUniCluster3.0 (2_Fundamentals: Creating Environments and 6_Dask).
Use one of our pre-configured Python modules and load it with 'module load ...'. If you are using your own conda environment, you have to install the packages 'dask' and 'dask-jobqueue' yourself.
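To quickly check that both packages are available in the active environment, you can import them in Python and print their versions (a minimal sanity check; the reported versions depend on the module or environment you loaded):

<pre>
>>> import dask, dask_jobqueue
>>> print(dask.__version__, dask_jobqueue.__version__)
</pre>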
== Using Dask ==
In an interactive Python shell, execute the following commands:
<pre>
>>> from dask_jobqueue import SLURMCluster
>>> cluster = SLURMCluster(cores=X, memory='X GB', queue='X')
</pre>
You have to specify how many cores and how much memory you want for one Dask worker, as well as the queue to submit the worker jobs to (see [[BwUniCluster_2.0_Batch_Queues]] for the available queues).
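To verify how these settings are translated into a SLURM batch script, you can print the job script that dask-jobqueue generates for each worker (an optional check; the output shows the #SBATCH options derived from cores, memory and queue):

<pre>
>>> print(cluster.job_script())
</pre>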
<pre>
>>> cluster.scale(X)
</pre>
Replace X with the number of workers you want to start.
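As an illustration, here is a minimal end-to-end sketch that puts these steps together and runs a small computation on the workers. The resource values below are only placeholders and have to be adapted to your needs and to the available queues:

<pre>
>>> from dask_jobqueue import SLURMCluster
>>> from dask.distributed import Client
>>> import dask.array as da

>>> # Placeholder resources: adapt cores, memory and queue to your use case
>>> cluster = SLURMCluster(cores=4, memory='16 GB', queue='X')
>>> cluster.scale(2)          # request 2 workers as SLURM jobs

>>> client = Client(cluster)  # connect to the cluster's scheduler

>>> x = da.random.random((10000, 10000), chunks=(1000, 1000))
>>> x.mean().compute()        # the computation runs on the Dask workers
</pre>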
----
[[Category:bwUniCluster_2.0|Access]][[Category:Access|bwUniCluster 2.0]]