DMPlexBuildFromCellListParallel#
Build distributed DMPLEX topology from a list of vertices for each cell (common mesh generator output)
Synopsis#
#include "petscdmplex.h"
#include "petscdmplex.h"
PetscErrorCode DMPlexBuildFromCellListParallel(DM dm, PetscInt numCells, PetscInt numVertices, PetscInt NVertices, PetscInt numCorners, const PetscInt cells[], PetscSF *vertexSF, PetscInt **verticesAdjSaved)
Input Parameters#
dm - The DM
numCells - The number of cells owned by this process
numVertices - The number of vertices to be owned by this process, or PETSC_DECIDE
NVertices - The global number of vertices, or PETSC_DETERMINE
numCorners - The number of vertices for each cell
cells - An array of numCells*numCorners numbers, the global vertex numbers for each cell
Output Parameters#
vertexSF - (Optional) SF describing complete vertex ownership
verticesAdjSaved - (Optional) vertex adjacency array
Notes#
Two triangles sharing a face
        2
      / | \
     /  |  \
    /   |   \
   0  0 | 1  3
    \   |   /
     \  |  /
      \ | /
        1
would have input
numCells = 2, numVertices = 4
cells = [0 1 2 1 3 2]
which would result in the DMPlex
        4
      / | \
     /  |  \
    /   |   \
   2  0 | 1  5
    \   |   /
     \  |  /
      \ | /
        3
Vertices are implicitly numbered consecutively 0, …, NVertices-1. Each rank owns a chunk of numVertices consecutive vertices. If numVertices is PETSC_DECIDE, PETSc will distribute them as evenly as possible using PetscLayout. If NVertices is PETSC_DETERMINE and numVertices is PETSC_DECIDE, NVertices is computed by PETSc as the maximum vertex index in cells + 1. If only NVertices is PETSC_DETERMINE, it is computed as the sum of numVertices over all ranks.
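For instance, the even split performed when numVertices is PETSC_DECIDE can be mimicked directly with PetscLayout. The following is only an illustrative sketch, not this routine's internal code; the helper name ShowVertexOwnership and the global size 4 (matching the two-triangle example above) are assumptions.

#include <petscdmplex.h>

/* Illustrative only: mimic the even vertex split used when numVertices is PETSC_DECIDE.
   Call after PetscInitialize(); assumes a global vertex count of 4 as in the example above. */
static PetscErrorCode ShowVertexOwnership(MPI_Comm comm)
{
  PetscLayout layout;
  PetscInt    nLocal, rStart, rEnd;

  PetscFunctionBeginUser;
  PetscCall(PetscLayoutCreate(comm, &layout));
  PetscCall(PetscLayoutSetSize(layout, 4)); /* global number of vertices */
  PetscCall(PetscLayoutSetUp(layout));      /* local size defaults to PETSC_DECIDE: split as evenly as possible */
  PetscCall(PetscLayoutGetLocalSize(layout, &nLocal));
  PetscCall(PetscLayoutGetRange(layout, &rStart, &rEnd)); /* this rank would own vertices [rStart, rEnd) */
  PetscCall(PetscPrintf(PETSC_COMM_SELF, "owned vertices: %" PetscInt_FMT ", range [%" PetscInt_FMT ", %" PetscInt_FMT ")\n", nLocal, rStart, rEnd));
  PetscCall(PetscLayoutDestroy(&layout));
  PetscFunctionReturn(PETSC_SUCCESS);
}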
The cell distribution is arbitrary and non-overlapping, and is independent of the vertex distribution.
Not currently supported in Fortran.
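As an illustration, here is a minimal sketch (not taken from the PETSc examples) that builds the two-triangle mesh above. It assumes exactly two MPI ranks and a hypothetical decomposition in which rank 0 contributes the left triangle and rank 1 the right one; coordinates would normally be attached afterwards with DMPlexBuildCoordinatesFromCellListParallel().

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM             dm;
  PetscSF        vertexSF;
  PetscMPIInt    rank;
  const PetscInt cells0[3] = {0, 1, 2}; /* left triangle, contributed by rank 0 */
  const PetscInt cells1[3] = {1, 3, 2}; /* right triangle, contributed by rank 1 */

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));

  PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
  PetscCall(DMSetType(dm, DMPLEX));
  PetscCall(DMSetDimension(dm, 2));
  /* 1 cell with 3 corners per rank; vertex ownership (numVertices) and the global
     vertex count (NVertices = 4) are left for PETSc to work out from the cell list */
  PetscCall(DMPlexBuildFromCellListParallel(dm, 1, PETSC_DECIDE, PETSC_DETERMINE, 3,
                                            rank ? cells1 : cells0, &vertexSF, NULL));
  PetscCall(PetscSFView(vertexSF, PETSC_VIEWER_STDOUT_WORLD)); /* inspect vertex ownership */
  PetscCall(PetscSFDestroy(&vertexSF));
  PetscCall(DMView(dm, PETSC_VIEWER_STDOUT_WORLD));
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}

Run with two ranks (e.g. mpiexec -n 2); with a different number of ranks the hard-coded decomposition above would no longer be non-overlapping.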
See Also#
DMPlexBuildFromCellList(), DMPlexCreateFromCellListParallelPetsc(), DMPlexBuildCoordinatesFromCellListParallel()
Level#
advanced
Location#
src/dm/impls/plex/plexcreate.c