FiPy requires one of Pysparse, SciPy, or Trilinos to be installed in order to solve linear systems. In our experience, FiPy runs most efficiently in serial when Pysparse is the linear solver. Trilinos is the most complete of the three suites, thanks to its numerous preconditioning and solver capabilities, and it also allows FiPy to run in parallel. Although less efficient than Pysparse and less capable than Trilinos, SciPy is a very popular package, widely available and easy to install. For this reason, SciPy may be the best linear solver choice when first installing and testing FiPy (and it is the only viable solver under Python 3.x).

FiPy chooses the solver suite based on availability or on user-supplied Command-line Flags and Environment Variables. For example, passing --no-pysparse:

$ python -c "from fipy import *; print(DefaultSolver)" --no-pysparse
<class 'fipy.solvers.trilinos.linearGMRESSolver.LinearGMRESSolver'>

uses a Trilinos solver. Setting FIPY_SOLVERS to scipy:

$ FIPY_SOLVERS=scipy python -c "from fipy import *; print(DefaultSolver)"
<class 'fipy.solvers.scipy.linearLUSolver.LinearLUSolver'>

uses a SciPy solver. Suite-specific solver classes can also be imported and instantiated, overriding any other directives. For example:

$ python -c "from fipy.solvers.scipy import DefaultSolver; \
>   print(DefaultSolver)" --no-pysparse
<class 'fipy.solvers.scipy.linearLUSolver.LinearLUSolver'>

uses a SciPy solver regardless of the command-line argument. In the absence of Command-line Flags and Environment Variables, FiPy’s order of precedence when choosing the solver suite for generic solvers is Pysparse, followed by Trilinos, PyAMG, and SciPy.
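That availability-driven precedence can be sketched in plain Python. This is a hypothetical illustration of the selection logic, not FiPy's actual implementation; the function name `pick_suite` and the exact module names probed are stand-ins:

```python
import importlib.util

# Hypothetical sketch (not FiPy's actual code): walk the precedence
# order and return the first suite whose package can be imported.
PRECEDENCE = ("pysparse", "PyTrilinos", "pyamg", "scipy")

def pick_suite(precedence=PRECEDENCE):
    for name in precedence:
        if importlib.util.find_spec(name) is not None:
            return name
    return None  # no suite available
```

On a machine with only SciPy installed, `pick_suite()` would return `"scipy"`, matching the LinearLUSolver output shown above.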


PETSc (the Portable, Extensible Toolkit for Scientific Computation) is a suite of data structures and routines for the scalable (parallel) solution of scientific applications modeled by partial differential equations. It employs the MPI standard for all message-passing communication (see Solving in Parallel for more details).


PETSc requires the petsc4py and mpi4py interfaces.


FiPy does not implement any preconditioner objects for PETSc. Simply pass one of the PCType strings in the precon= argument when declaring the solver.
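As a sketch, assuming FiPy's PETSc solver suite is installed, the precon= argument takes a PCType string such as "jacobi" (others include "ilu" and "sor"); the snippet is guarded so it degrades gracefully where PETSc is absent:

```python
# Sketch: select a PETSc preconditioner by PCType string via precon=.
# Requires FiPy built against petsc4py and mpi4py.
try:
    from fipy.solvers.petsc import LinearGMRESSolver
    solver = LinearGMRESSolver(precon="jacobi")  # "jacobi" is a PETSc PCType
except ImportError:
    solver = None  # PETSc suite not installed in this environment
```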


Pysparse is a fast serial sparse matrix library for Python. It provides several sparse matrix storage formats and conversion methods. It also implements a number of iterative solvers, preconditioners, and interfaces to efficient factorization packages. The only requirement to install and use Pysparse is NumPy.


FiPy requires version 1.0 or higher of Pysparse.


The scipy.sparse module provides a basic set of serial Krylov solvers, but no preconditioners.
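For illustration, here is a minimal use of one of those serial Krylov solvers on the kind of sparse system a one-dimensional diffusion problem assembles. This is a generic SciPy sketch, not FiPy code:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

# 1-D Laplacian: the tridiagonal matrix a simple diffusion term produces
n = 50
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Conjugate gradient with no preconditioner (M defaults to identity);
# info == 0 signals convergence to the requested tolerance.
x, info = cg(A, b)
```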


The PyAMG package provides adaptive multigrid preconditioners that can be used in conjunction with the SciPy solvers.
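A sketch of that combination, guarded since PyAMG may not be installed: a PyAMG smoothed-aggregation multigrid hierarchy is wrapped as the preconditioner M for SciPy's conjugate gradient solver.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

# Same model problem as above: a 1-D Laplacian
n = 100
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

try:
    import pyamg
    # AMG hierarchy exposed as a LinearOperator preconditioner
    M = pyamg.smoothed_aggregation_solver(A).aspreconditioner()
except ImportError:
    M = None  # PyAMG absent: CG runs unpreconditioned

x, info = cg(A, b, M=M)
```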


The pyamgx package is a Python interface to the NVIDIA AMGX library. pyamgx can be used to construct complex solvers and preconditioners to solve sparse linear systems on the GPU.


Trilinos provides a more complete set of solvers and preconditioners than either Pysparse or SciPy. Trilinos preconditioning allows for iterative solutions to some difficult problems that Pysparse and SciPy cannot solve, and it enables parallel execution of FiPy (see Solving in Parallel for more details).


Be sure to build or install the PyTrilinos interface to Trilinos.


FiPy runs more efficiently when Pysparse is installed alongside Trilinos.


Trilinos is a large software suite with its own set of prerequisites, and can be difficult to set up. It is not necessary for most problems, and is not recommended for a basic install of FiPy.


Trilinos must be compiled with MPI support for Solving in Parallel.


Trilinos parallel efficiency is greatly improved by also installing Pysparse. If Pysparse is not installed, be sure to use the --no-pysparse flag.


Trilinos solvers frequently give intermediate output that FiPy cannot suppress. The most commonly encountered messages are

Gen_Prolongator warning : Max eigen <= 0.0

which is not significant to FiPy.

Aztec status AZ_loss: loss of precision

which indicates that there was some difficulty in solving the problem to the requested tolerance due to precision limitations, but usually does not prevent the solver from finding an adequate solution.

Aztec status AZ_ill_cond: GMRES hessenberg ill-conditioned

which indicates that GMRES is having trouble with the problem; if GMRES fails, trying a different solver or preconditioner may give more accurate results.

Aztec status AZ_breakdown: numerical breakdown

which usually indicates serious problems solving the equation, forcing the solver to stop before reaching an adequate solution. Different solvers, different preconditioners, or a less restrictive tolerance may help.
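One pragmatic response to such a breakdown is to retry the solve with a different solver or preconditioner. A generic, hypothetical fallback pattern (solver-agnostic, not FiPy API; `solve_with_fallback` is a stand-in name):

```python
def solve_with_fallback(attempts):
    # attempts: sequence of (label, callable) pairs, tried in order.
    # Each callable runs one solver/preconditioner combination and is
    # assumed to raise RuntimeError on numerical breakdown.
    for label, attempt in attempts:
        try:
            return label, attempt()
        except RuntimeError:
            continue  # e.g. AZ_breakdown: fall through to the next combo
    raise RuntimeError("all solver/preconditioner combinations failed")
```

One might, for instance, attempt GMRES with an incomplete-factorization preconditioner first and fall back to a direct LU solve.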

Last updated on Jan 14, 2021. Created using Sphinx 3.4.3.