BwHPC BPG for Engineering

From bwHPC Wiki
Revision as of 12:41, 4 March 2014


1 OpenFOAM

Module:          cae/openfoam/'version'
Target System:   Red-Hat-Enterprise-Linux-Server-release-6.4-Santiago
Main Location:   /opt/bwhpc/common
Last Update:     26.02.2014
Priority:        mandatory
License:         GPL
Homepage:        http://www.openfoam.org/

1.1 Accessing and basic usage

The OpenFOAM® (Open Field Operation and Manipulation) CFD Toolbox is a free, open source CFD software package with an extensive range of features to solve anything from complex fluid flows involving chemical reactions, turbulence and heat transfer, to solid dynamics and electromagnetics.

To check which OpenFOAM versions are installed on the system, run the following command:

$ module avail cae/openfoam

Typically, several OpenFOAM versions might be available.

Any available version can be accessed by loading the appropriate module:

$ module load cae/openfoam/<version>

with <version> specifying the desired version.

To activate the OpenFOAM applications, after the module is loaded run the following:

$ source $FOAM_INIT

or simply:

$ foamInit
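
Putting these steps together, a batch job can load the module and initialize OpenFOAM before calling a solver. The following is only a sketch: the version string and the solver simpleFoam are placeholder examples, not values prescribed by this guide.

```shell
#!/bin/bash
# Load the desired OpenFOAM module (replace <version> with an
# installed version, as listed by `module avail cae/openfoam`)
module load cae/openfoam/<version>

# Activate the OpenFOAM applications in this shell
source $FOAM_INIT    # or equivalently: foamInit

# Run a serial solver in the current case directory
# (simpleFoam is only an example solver)
simpleFoam
```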

1.2 Parallel Computing with OpenFOAM

To speed up parallel solving and reduce the likelihood of errors when running an OpenFOAM job in parallel, some modifications have been introduced. Specifically, after the case has been decomposed, it is recommended to store the decomposed data on a pre-allocated work-space and use it from there during the parallel run. When the parallel run is finished, the data must be copied back to the local case folder and the case reconstructed. Therefore, for decomposition and reconstruction of the case, please use the following commands:

$ decomposeParHPC
$ reconstructParHPC
$ reconstructParMeshHPC

instead of:

$ decomposePar
$ reconstructPar
$ reconstructParMesh

For example, if you want to run snappyHexMesh in parallel, you may use the following commands:

$ decomposeParHPC
$ mpiexec snappyHexMesh -overwrite
$ reconstructParMeshHPC -constant

instead of:

$ decomposePar
$ mpiexec snappyHexMesh -overwrite
$ reconstructParMesh -constant
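
A complete parallel solver run following the recommendation above might look like the following job-script sketch. The process count and the solver pimpleFoam are assumptions chosen for illustration; only the *HPC wrapper commands are taken from this guide.

```shell
#!/bin/bash
# Load and initialize OpenFOAM (replace <version> with an installed version)
module load cae/openfoam/<version>
source $FOAM_INIT

# Decompose the case; the HPC wrapper also takes care of placing
# the decomposed data on the pre-allocated work-space
decomposeParHPC

# Solve in parallel; the process count must match the decomposition
# in system/decomposeParDict (4 is only an example), and pimpleFoam
# is only an example solver
mpiexec -n 4 pimpleFoam -parallel

# Copy the data back to the local case folder and reconstruct the case
reconstructParHPC
```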