Installing and Running on Ubuntu

Users who want to work with PFLOTRAN-OGS at the code level can install and build it from source.

Ubuntu versions 16.04 and 18.04 are supported; 20.04 is not currently supported.

The instructions below describe how to install the software on an Ubuntu Desktop system. Support for clients working with their own builds is arranged on a case-by-case basis. Please get in touch with OpenGoSim to find out more.

Requirements

We assume an up-to-date Ubuntu 16.04 or 18.04 installation. The following packages from apt are also required. If you are interested in understanding why each package is needed, we describe them individually below. Otherwise, the following command to install them all at once should be sufficient:

sudo apt install git build-essential gfortran python python-six flex bison

Looking at these individually:

  • git: The software management tool git will be needed to download PFLOTRAN-OGS and PETSc (see below).

  • build-essential and gfortran provide compilers for Fortran, C, and C++, and other necessary compilation tools and libraries.

  • python and python-six: Python, and this specific module, are required for several install scripts (note that Python 3 won’t work).

  • flex and bison are required to build PETSc (see below).

The other major requirement for PFLOTRAN-OGS is the PETSc library, which we will install in the next section. PETSc has a number of further dependencies, but fortunately it can be configured to download and install all of these automatically.
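Before moving on, it can be worth confirming that the tools these packages provide are actually on the PATH. A minimal sketch (the `check_tools` helper is hypothetical, not part of PFLOTRAN-OGS):

```shell
#!/bin/sh
# check_tools prints a line for each command that cannot be found;
# it prints nothing when everything is present.
check_tools() {
    for tool in "$@"; do
        command -v "$tool" >/dev/null 2>&1 || echo "missing: $tool"
    done
}

# Tools provided by the apt packages installed above:
check_tools git gcc g++ gfortran make python flex bison
```

If any line is printed, re-run the apt install command above before continuing.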

Downloading PFLOTRAN-OGS

Use git to clone the PFLOTRAN-OGS repository.

For OGS-1.5:

git clone https://bitbucket.org/opengosim/pflotran_ogs_1.5.git

This will place PFLOTRAN-OGS in a folder called pflotran_ogs_1.5.

For OGS-1.4 please use:

git clone https://bitbucket.org/opengosim/pflotran_ogs_1.4.git

This will place PFLOTRAN-OGS in a folder called pflotran_ogs_1.4.

Installing PETSc

Now we will download and install the PETSc library.

Create a directory for PETSc, for example:

mkdir petsc

Clone the PETSc repository from git into that directory using the following command:

git clone https://gitlab.com/petsc/petsc petsc

To compile PETSc and PFLOTRAN-OGS we have to set some simple environment variables. This is further explained in PETSc environment variables, but that is not required reading.

First we must set PETSC_DIR to the path of the directory we cloned PETSc into, so for example:

export PETSC_DIR=/home/myusername/petsc

We must also set a variable called PETSC_ARCH. This is used for managing multiple installs/configurations of PETSc, so we could use anything vaguely descriptive. Following a common convention, let’s use:

export PETSC_ARCH=ubuntu-opt

because we’re installing on Ubuntu, and we’re going to configure for a fast running (optimised ‘opt’, as opposed to debugging, ‘dbg’) version of PETSc.

We’re going to need these variables defined every time we compile PETSc or PFLOTRAN, so it might be convenient to also paste these two lines into the .bashrc file so that they are defined every time a new terminal window is opened.
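For reference, the two exports can be set, and optionally persisted, in one go; the PETSc path below is an example and should match the directory you cloned into:

```shell
# Set the PETSc variables for the current shell session.
# $HOME/petsc is an example path; use your own clone location.
export PETSC_DIR="$HOME/petsc"
export PETSC_ARCH=ubuntu-opt

# To persist them, append the same two lines to ~/.bashrc, e.g.:
#   echo 'export PETSC_DIR=$HOME/petsc' >> ~/.bashrc
#   echo 'export PETSC_ARCH=ubuntu-opt' >> ~/.bashrc

echo "PETSC_DIR=$PETSC_DIR PETSC_ARCH=$PETSC_ARCH"
```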

PFLOTRAN-OGS needs a specific snapshot of PETSc. We can check what this is in the file petsc-git-version.txt, included in the copy of PFLOTRAN-OGS we downloaded. For example:

nano pflotran_ogs_1.5/tools/buildbot/petsc/petsc-git-version.txt

For example, the version at the time of writing is v3.12.2, which we will use in this tutorial.

Next we go into the PETSc directory:

cd petsc

And check out the specific snapshot that PFLOTRAN-OGS needs, for example:

git checkout v3.12.2
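The two steps above can also be combined so that the checked-out tag always matches the pinned one. A sketch, assuming the directory layout used in this guide (the `pinned_version` helper is hypothetical):

```shell
#!/bin/sh
# pinned_version prints the contents of a one-line version file
# with any stray whitespace removed.
pinned_version() {
    tr -d ' \t\r\n' < "$1"
}

file=pflotran_ogs_1.5/tools/buildbot/petsc/petsc-git-version.txt
if [ -f "$file" ]; then
    version=$(pinned_version "$file")
    echo "checking out PETSc $version"
    (cd petsc && git checkout "$version")
fi
```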

Now we configure PETSc. Among other things, this is the part where PETSc will download and install any dependent libraries if we ask it to. So we will ask it to:

./configure --download-mpich=yes --download-hdf5=yes --download-fblaslapack=yes --download-metis=yes  --download-cmake=yes  --download-ptscotch=yes --download-hypre=yes --with-debugging=0 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 FOPTFLAGS=-O3

For advanced information about configuring PETSc against existing installs of any of these libraries, for example replacing mpich with openmpi, see the Advanced PETSc Options section below.

The configure process will take a little time. Eventually the configuration script will finish, and even tell us how to proceed. We should see output like this at the end:

xxx=========================================================================xxx
 Configure stage complete. Now build PETSc libraries with:
   make PETSC_DIR=/home/myusername/petsc PETSC_ARCH=ubuntu-opt all
xxx=========================================================================xxx

So we copy and paste this into the terminal:

make PETSC_DIR=/home/myusername/petsc PETSC_ARCH=ubuntu-opt all

Then we wait again for the build to complete. Eventually we will see more advice from the scripts:

Now to check if the libraries are working do:
make PETSC_DIR=/home/myusername/petsc PETSC_ARCH=ubuntu-opt check
=========================================

We again oblige by copying and pasting:

make PETSC_DIR=/home/myusername/petsc PETSC_ARCH=ubuntu-opt check

This gives output something like:

ubuntu-opt check
Running test examples to verify correct installation
Using PETSC_DIR=/home/daniel/new_petsc and PETSC_ARCH=ubuntu-opt
C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI process
C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI processes
C/C++ example src/snes/examples/tutorials/ex19 run successfully with hypre
C/C++ example src/vec/vec/examples/tutorials/ex47 run successfully with hdf5
Fortran example src/snes/examples/tutorials/ex5f run successfully with 1 MPI process
Completed test examples

Installing PFLOTRAN-OGS

Now PETSc should be installed correctly and we are ready to move on to PFLOTRAN-OGS.

We have already downloaded PFLOTRAN-OGS through git, and have it in a folder called pflotran_ogs_1.5.

Navigate into the following directory:

cd pflotran_ogs_1.5/src/pflotran

Now we compile (in this case with four cores, to speed things up) by entering:

make -j4 pflotran

This should take a few minutes, after which, PFLOTRAN-OGS will be ready to use.

Testing the PFLOTRAN-OGS Installation - Regression Tests

A set of fast automatic regression tests can be run to check that the PFLOTRAN-OGS installation is working correctly.

To do this, after building in the previous section, enter the following in the same directory:

make test

This will run the tests. First a number of unit tests will be run, to check individual parts of the code, then the regression tests will be run, to check the overall simulator.

When running the regression tests, the screen will look like this:

Running pflotran regression tests :

  Legend

    . - success
    F - failed regression test (results are outside error tolerances)
    M - failed regression test (results are FAR outside error tolerances)
    G - general error
    U - user error
    X - code crashed
    T - time out error
    C - configuration file [.cfg] error
    I - missing information (e.g. missing files)
    A - pre-processing error (e.g. error in simulation setup scripts)
    B - post-processing error (e.g. error in solution comparison)
    S - test skipped
    W - warning
    ? - unknown

....................................................................................................................................................

--------------------------------------------------------------------------------
Regression test summary:
    Total run time: 106.924 [s]
    Total tests : 148
    Tests run : 148
    All tests passed.

The regression tests are simple simulation models that can be completed very quickly. The values of various physical properties (e.g. pressure, gas saturation, etc.) in certain cells are output to a .regression file at the end of the run. The regression test script compares these values to the same values from an earlier run, deemed to be correct (called “gold” values), and alerts us if the values differ by a nontrivial amount.

Thus the regression tests tell us if the physical properties computed by the simulator have changed since the “gold” values were defined.

After running the regression tests, we should see all dots, indicating all successful tests.

It is possible for some regression tests to fail simply because slight differences in the system or compiler setup change, for example, the rounding error, causing the final physical values to deviate from the “gold” values by slightly more than the specified tolerances, which are quite strict.

In this case the above output will show some “F”s instead of “.”s, indicating the failures. Up to a few, say five, failures are likely to be due to minor system variations as just described, but many more, or all “F”s, indicate something has gone wrong.

Another possible outcome is that some tests show “T”, indicating that they have exceeded 60 seconds in runtime, which is considered a failure. This can happen on an outdated machine. It can also happen when the machine has few cores, and/or MPI has not been set up correctly (note that in the PETSc install above we allowed PETSc to download mpich itself, but we may not always choose to do that; see later sections).

Other outputs indicate more serious problems. Seeing “M” indicates that a regression test has failed by producing output very different from what was expected, which cannot be explained by differences between systems. Seeing “G”, “U” or “X” indicates that the simulator has thrown an error or outright crashed. In all these cases the install has not been successful.

To take a closer look at the results of the tests, a full log is stored in:

[pflotran directory]/regression_tests/pflotran-tests-[time tests were run].testlog

An example might be:

[home directory]/pflotran_ogs_1.5/regression_tests/pflotran-tests-2021-06-23_12-12-44.testlog
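Rather than typing the timestamp by hand, the newest log can be located by modification time. A sketch (the `latest_log` helper is hypothetical):

```shell
#!/bin/sh
# latest_log prints the most recently modified .testlog file
# in the given directory, or nothing if there are none.
latest_log() {
    ls -t "$1"/pflotran-tests-*.testlog 2>/dev/null | head -n 1
}

# For example, run from the pflotran source directory:
latest_log regression_tests
```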

In this log file we can see details of all the tests that were run. Most importantly, if a test has failed, we can see additional information. If a test failed with “F” or “M”, we can see the deviation between the physical values and the “gold” values. An example of an “F” is shown below:

cp_np2...

  Run...
    cd /shared/pflotran/regression_tests/towg/bo
    /opt/intel/compilers_and_libraries_2019/linux/mpi/intel64/bin/mpiexec -np 2 /shared/pflotran/src/pflotran/pflotran -malloc_debug no -successful_exit_code 86 -input_prefix cp_np2
    # cp_np2 : run time : 0.40 seconds
    diff cp_np2.regression.gold cp_np2.regression
    FAIL: Liquid Energy:Min : 1.90003568434e-12 > 1e-12 [absolute]
    FAIL: Liquid Energy:1 : 2.19979590099e-12 > 1e-12 [absolute]
    FAIL: Liquid Energy:Mean : 4.49995596341e-12 > 1e-12 [absolute]
    FAIL: Oil Density:Min : 5.50016920897e-10 > 1e-12 [absolute]
    FAIL: Oil Density:300 : 5.50016920897e-10 > 1e-12 [absolute]
    FAIL: Oil Density:Max : 1.5500063455e-09 > 1e-12 [absolute]
    FAIL: Oil Density:1 : 1.8300170268e-09 > 1e-12 [absolute]
    FAIL: Oil Density:Mean : 3.73995590053e-09 > 1e-12 [absolute]
    FAIL: Oil Energy:Min : 2.34499974994e-10 > 1e-12 [absolute]
    FAIL: Oil Energy:300 : 7.69997399175e-11 > 1e-12 [absolute]
    FAIL: Oil Energy:Max : 7.69997399175e-11 > 1e-12 [absolute]
    FAIL: Oil Energy:1 : 2.72299960358e-10 > 1e-12 [absolute]
    FAIL: Oil Energy:Mean : 5.42000000436e-10 > 1e-12 [absolute]
    FAIL: Gas Density:Min : 3.73000830223e-09 > 1e-12 [absolute]
    FAIL: Gas Density:300 : 3.73000830223e-09 > 1e-12 [absolute]
    FAIL: Gas Density:Max : 8.8799652076e-09 > 1e-12 [absolute]
    FAIL: Gas Density:1 : 1.24500161292e-08 > 1e-12 [absolute]
    FAIL: Gas Density:Mean : 2.50300047355e-08 > 1e-12 [absolute]
    FAIL: Gas Energy:Min : 1.05995212607e-11 > 1e-12 [absolute]
    FAIL: Gas Energy:Mean : 2.50022225146e-12 > 1e-12 [absolute]
    Skipping SOLUTION : Flow
cp_np2... failed.

For example, the average gas density has deviated from the expected value by about 1.0e-8, which is not much, but it is greater than the expected tolerance of 1.0e-12. It is always worth double-checking that regression test failures are of this trivial sort when running the regression tests.

In the case of “G”, “U” or “X” failures, the output from the simulator can be found in the log file, often allowing us to see error messages useful for debugging.

After running the tests, if everything went fine, we can clean up, which includes deleting the log files, with:

make clean-tests

Running Pflotran From the Command Line

This section will help you understand the options available for running PFLOTRAN-OGS from the command line.

Through Scripts

In practice it is much more convenient to run PFLOTRAN-OGS through simple scripts. We provide an example here. Please open the file and read the short instructions before running. In particular, note that it must be updated with the path of your pflotran install directory (e.g., /home/myusername/pflotran), and needs the PETSC_DIR and PETSC_ARCH variables to be defined (see above).
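As an illustration only, a wrapper of this kind might look like the sketch below; the paths and the script name `run_pflotran.sh` are assumptions, and the script actually shipped by OpenGoSim may differ:

```shell
#!/bin/bash
# run_pflotran.sh -- hypothetical wrapper; adjust paths to your install.
# Usage: ./run_pflotran.sh <case>.in [nproc]
# Requires PETSC_DIR and PETSC_ARCH to be exported (see above).

PFLOTRAN_BIN="$HOME/pflotran/src/pflotran/pflotran"
MPIRUN="$PETSC_DIR/$PETSC_ARCH/bin/mpirun"

# output_prefix derives an output prefix from the input deck name,
# e.g. spe10.in -> spe10_run
output_prefix() {
    echo "$(basename "$1" .in)_run"
}

deck="$1"
nproc="${2:-4}"

if [ -z "$deck" ] || [ ! -x "$MPIRUN" ] || [ ! -x "$PFLOTRAN_BIN" ]; then
    echo "usage: $0 <case>.in [nproc] (check PETSC_DIR, PETSC_ARCH and paths)"
else
    "$MPIRUN" -np "$nproc" "$PFLOTRAN_BIN" \
        -pflotranin "$deck" -output_prefix "$(output_prefix "$deck")"
fi
```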

Directly Through the Command Line

A typical call to run PFLOTRAN-OGS from the command line will look like:

/home/myusername/petsc/ubuntu-opt/bin/mpirun -np 4 /home/myusername/pflotran/src/pflotran/pflotran -pflotranin spe10.in -output_prefix test_spe10_run

We now explain each part of this:

  • /home/myusername/petsc/ubuntu-opt/bin/mpirun : we call mpirun to start a parallel program. Note that this is the mpirun binary that was installed as part of the PETSc configuration process above. We need to make sure we use the same MPI installation for compiling and running PFLOTRAN-OGS.

  • -np 4 : an argument to mpirun, specifying how many processes to use. In this case, four.

  • /home/myusername/pflotran/src/pflotran/pflotran : this is the PFLOTRAN-OGS binary we compiled earlier.

  • -pflotranin spe10.in : this is an argument to PFLOTRAN-OGS, specifying an input file to use, in this case called ‘spe10.in’.

  • -output_prefix test_spe10_run : specify an output prefix to be applied to all output files generated by this run.

Advanced PETSc Options

Configuring against existing Open MPI Install

Here is an example:

CONFIGURE_OPTIONS = --with-debugging=0 --download-fblaslapack=1 --with-fc=/usr/lib64/openmpi/bin/mpif90 --with-cc=/usr/lib64/openmpi/bin/mpicc --with-cxx=/usr/lib64/openmpi/bin/mpicxx --with-mpi-include=/usr/include/openmpi-x86_64 --with-mpi-lib=/usr/lib64/openmpi/lib/libmpi.so --download-cmake=1 --download-ptscotch=1 --download-hypre=1 --download-hdf5=1 --with-c2html=0 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 FOPTFLAGS=-O3 --with-shared-libraries=0

Note the lack of the --download-mpich=yes or --download-openmpi=yes options, as well as the presence of the following:

  • A path to an MPI fortran compiler: --with-fc=/usr/lib64/openmpi/bin/mpif90

  • A path to an MPI c compiler: --with-cc=/usr/lib64/openmpi/bin/mpicc

  • A path to an MPI c++ compiler: --with-cxx=/usr/lib64/openmpi/bin/mpicxx

  • A path to the MPI include directory: --with-mpi-include=/usr/include/openmpi-x86_64

  • A path to the MPI libraries: --with-mpi-lib=/usr/lib64/openmpi/lib/libmpi.so

Note that it might be necessary to ensure certain libraries are on the library path, e.g.

export LD_LIBRARY_PATH=/usr/lib64/openmpi/lib:$LD_LIBRARY_PATH

Recall that you must use the mpirun binary associated with the Open MPI install when running PFLOTRAN-OGS, so the above example of running on the command line generalizes to:

/location/of/correct/mpirun -np 4 /home/myusername/pflotran/src/pflotran/pflotran -pflotranin spe10.in -output_prefix test_spe10_run