BwHPC BPG for Engineering

From bwHPC Wiki

Revision as of 10:35, 19 March 2014

Last Update: 26.02.2014
1 OpenFOAM


Module:         cae/openfoam/<version>
Target System:  Red-Hat-Enterprise-Linux-Server-release-6.4-Santiago
Main Location:  /opt/bwhpc/common
Priority:       mandatory
License:        GPL
Homepage:       http://www.openfoam.org/

1.1 Accessing and basic usage

The OpenFOAM® (Open Field Operation and Manipulation) CFD Toolbox is a free, open source CFD software package with an extensive range of features to solve anything from complex fluid flows involving chemical reactions, turbulence and heat transfer, to solid dynamics and electromagnetics.

In order to check what OpenFOAM versions are installed on the system, run the following command:

$ module avail cae/openfoam

Typically, several OpenFOAM versions are available on the system.

Any available version can be accessed by loading the appropriate module:

$ module load cae/openfoam/<version>

with <version> specifying the desired version.

To activate the OpenFOAM applications, after the module is loaded, run the following:

$ source $FOAM_INIT

or simply:

$ foamInit
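Putting the steps above together, a typical interactive session might look like the following sketch. The version string 2.2.2 and the tutorial case are only illustrative examples; use a version reported by module avail and a case of your own:

```shell
# Load an OpenFOAM module (2.2.2 is a hypothetical example;
# pick a version listed by `module avail cae/openfoam`).
module load cae/openfoam/2.2.2

# Activate the OpenFOAM applications (equivalent to `source $FOAM_INIT`).
foamInit

# Copy a tutorial case into the current directory, mesh it, and solve it
# serially ($FOAM_TUTORIALS is set by the OpenFOAM environment).
cp -r "$FOAM_TUTORIALS/incompressible/icoFoam/cavity" .
cd cavity
blockMesh
icoFoam
```

The environment variables and application names are only available after the module has been loaded and foamInit has been run.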

1.2 Parallel Computing with OpenFOAM

To speed up the parallel solution process and to reduce the probability of errors when running an OpenFOAM job in parallel (on a single node), some modifications have been introduced. Specifically, after the case has been decomposed, it is recommended to store the decomposed data in a pre-allocated workspace and to run the parallel solver from there. When the parallel run is over, the data must be copied back to the case folder and reconstructed. Therefore, to decompose and reconstruct the case, please use the following commands:

$ decomposeParHPC
$ reconstructParHPC
$ reconstructParMeshHPC

instead of:

$ decomposePar
$ reconstructPar
$ reconstructParMesh

For example, if you want to run snappyHexMesh in parallel, you may use the following commands:

$ decomposeParHPC
$ mpiexec snappyHexMesh -overwrite
$ reconstructParMeshHPC -constant

instead of:

$ decomposePar
$ mpiexec snappyHexMesh -overwrite
$ reconstructParMesh -constant
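As a sketch, the complete single-node parallel workflow (decomposition, parallel solve, reconstruction) can be combined into one script using the HPC wrapper commands described above. The module version, case directory, solver name simpleFoam, and process count 4 are placeholders, not prescribed values; the number of processes must match the number of subdomains requested in the case's decomposeParDict:

```shell
#!/bin/bash
# Sketch of a single-node parallel OpenFOAM run with the HPC wrappers.
# Version, case path, solver, and -np 4 are placeholder choices.

module load cae/openfoam/2.2.2   # hypothetical version
foamInit

cd path/to/case                  # placeholder case directory

# Decompose the case; the HPC wrapper stores the decomposed data
# in a pre-allocated workspace.
decomposeParHPC

# Run the solver in parallel on 4 processes.
mpiexec -np 4 simpleFoam -parallel

# Copy the data back to the case folder and reconstruct it.
reconstructParHPC
```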