XGCa
petsc_solver_module Module Reference

Public Member Functions

subroutine init_ksp_comm (nnode)
 Initializes an MPI communicator for use with PETSc KSP solves (Poisson, Ampere). The size of the KSP comm group is set such that, if possible, the number of equations per MPI rank is larger than 5000, which is roughly the weak scaling rollover. More...
 
subroutine petsc_get_sizes (grid, bc, n_equation, n_boundary, xgc_petsc, petsc_xgc_bd)
 Calculate (i) the number of equations (vertices) of the solver, (ii) the number of XGC boundary vertices included in the solver, (iii) the (preliminary) mapping from XGC vertices to PETSc equations, and (iv) the mapping from XGC vertices to PETSc boundary conditions, all based on the XGC boundary object passed as input. More...
 
subroutine petsc_get_bd_map (grid, bc, n_boundary, petsc_xgc_bd, petsc_bd_xgc)
 Generate mapping from PETSc boundary conditions to XGC vertices. This requires the results of petsc_get_sizes because the number of boundary conditions is not known a priori. More...
 
subroutine getnnz (grid, nloc, low, high, d_nnz, o_nnz, xgc_petsc, nglobal, ierr)
 Computes the number of non-zero entries nnz per row in the rank-local rows. More...
 
subroutine petsc_get_partitioning (grid, bc, solver_data, comm, num_pe, n_eq_tot_in, set_diffusion_matrix, xgc_petsc, petsc_xgc_bd, n_eq_loc, xgc_proc_out, proc_eq, ierr)
 Generates a domain partitioning of the part of the XGC mesh used by a solver (as defined by the boundary conditions bc) using ParMETIS, and builds a mapping between PETSc equation indices and XGC MPI ranks (those in the communicator comm used by the solver). More...
 
subroutine petsc_get_mapping (nnode, bc, comm, num_pe, n_eq_tot, n_eq_loc, xgc_proc, proc_eq, xgc_petsc, petsc_xgc, petscloc_xgc)
 Based on the ParMETIS partition of the XGC domain, updates the mappings between XGC vertices and PETSc equations. More...
 
subroutine petsc_get_template_mat (grid, comm, n_eq_tot, n_eq_loc, xgc_petsc, solver_template_mat, ierr)
 Uses pre-computed (petsc_get_partitioning) local matrix sizes and XGC vertex to PETSc equation mapping to set up a blank template matrix with sufficient pre-allocated memory. More...
 
subroutine petsc_get_bc_mat (solver, n_eq_tot, n_eq_loc, ierr)
 Creates the (empty) LHS and RHS matrices for Dirichlet boundary conditions. More...
 
subroutine petsc_get_fsa_mat (solver, n_eq_tot, n_eq_loc, ierr)
 Creates and allocates interior/surface FSA matrices. More...
 
subroutine petsc_get_fsa_bc_mat (solver, ierr)
 Creates and allocates boundary/surface FSA matrices. More...
 
subroutine petsc_get_scatter_one_block (solver, nnode, n_eq_tot, n_eq_loc, petsc_xgc, ierr)
 Generate PETSc scatter mapping from XGC vertices to PETSc equation for a simple solver with one equation (block). Also sets up the corresponding LHS and RHS PETSc vectors. More...
 
subroutine petsc_get_scatter_multi_block (nnode, ksp, mat_all_blocks, mat_one_block, blocksize, varnames, n_eq_tot, n_eq_loc, petscloc_xgc, petsc_xgc, iss, to_petsc, from_petsc, ierr)
 Sets up a PETSc scatter object for a (2D) multi-block field-split solver. More...
 
subroutine petsc_to_xgc (nnode, blocksize, from_petsc, field_petsc, field_xgc, ierr)
 This routine scatters values from the distributed vector compatible with the block-matrix of the 2D diffusion model to local variables on the XGC solver grid. More...
 
subroutine petsc_to_xgc_bd (nn, this, x, xvec)
 Scatters a PETSc vector to XGC boundary data. More...
 
subroutine xgc_to_petsc (nnode, blocksize, to_petsc, field_petsc, field_xgc, ierr)
 This routine scatters values from local variables on the XGC solver grid to the distributed vector compatible with the block-matrix of the 2D diffusion model. More...
 
subroutine xgc_to_petsc_bd (nn, this, x, xvec)
 Scatters XGC boundary data to a PETSc vector. More...
 
subroutine create_1field_solver (solver, mat, ierr)
 Creates a PETSc KSP solver for a one-block system, i.e., for a single equation on the XGC mesh. More...
 
subroutine create_axisym_iter_solver (solver, ierr)
 Creates and sets up the PETSc KSP solver and preconditioner for inverting the axisymmetric Poisson operator. More...
 
subroutine diffusion_matrix_init (diffusion_ts, grid, bc, solver_template_mat, ierr)
 Set up a template matrix for XGC's anomalous diffusion time integrator. This is currently a system of either 4 (adiabatic electrons) or 7 (kinetic electrons) equations. Impurities are not supported yet. More...
 
subroutine set_adiabatic_blending_weights (grid, vec, xgc_petsc, ierr)
 Sets values in a PETSc vector according to the blending function \(\alpha(\psi)\), where \(\alpha = 1\) in region 1 for \(\psi \le \psi_{\mathrm{in}}\), \(\alpha = \frac{\psi_{\mathrm{out}} - \psi}{\psi_{\mathrm{out}} - \psi_{\mathrm{in}}}\) in region 1 for \(\psi_{\mathrm{in}} < \psi < \psi_{\mathrm{out}}\), and \(\alpha = 0\) in region 1 for \(\psi_{\mathrm{out}} \le \psi\) and outside of region 1. More...
 
subroutine create_helmholtz_solver (solver, solver_data, grid, bc, ierr)
 Set up a KSP solver for a single Helmholtz-type equation (Poisson equation, Ampere's law). More...
 
subroutine create_spectral_helmholtz_solver (solver, solver_data, grid, bc, ntor, ierr)
 Set up a KSP solver for a single toroidal mode number component of a Helmholtz-type equation (Poisson equation, Ampere's law). More...
 
subroutine init_helmholtz_solver (psn, grid, solver, solver_data, bd, comm, prefix, n_rhs_mat, is_axisym, is_spectral, is_ampere, is_ampere_cv, is_update)
 Setup routine for the Poisson and Ampere's law solvers. This routine can be used for the initial setup, i.e., initializing the distributed PETSc matrices and the KSP solver, and for updating existing solvers for the evolving background profiles. More...
 

Member Function/Subroutine Documentation

subroutine petsc_solver_module::create_1field_solver ( type(xgc_solver)  solver,
  mat,
intent(out)  ierr 
)

Creates a PETSc KSP solver for a one-block system, i.e., for a single equation on the XGC mesh.

Parameters
[in,out]  solver  XGC solver object, type(xgc_solver)
[in]  mat  Matrix from which to create the KSP, PETSc Mat
[out]  ierr  PETSc error code
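
At its core, such a one-block setup boils down to a few PETSc calls. A minimal sketch, assuming the solver object stores its communicator and KSP handle in components named comm and ksp (illustrative names, not necessarily XGC's):

    ! Minimal one-block KSP setup (sketch; component names are assumptions)
    call KSPCreate(solver%comm, solver%ksp, ierr)
    call KSPSetOperators(solver%ksp, mat, mat, ierr)  ! same Mat as operator and preconditioner
    call KSPSetFromOptions(solver%ksp, ierr)          ! honor -ksp_type / -pc_type options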


subroutine petsc_solver_module::create_axisym_iter_solver ( type(xgc_solver)  solver,
intent(out)  ierr 
)

Creates and sets up the PETSc KSP solver and preconditioner for inverting the axisymmetric Poisson operator.

Parameters
[in,out]  solver  XGC solver object, type(xgc_solver)
[out]  ierr  PETSc error code


subroutine petsc_solver_module::create_helmholtz_solver ( type(xgc_solver)  solver,
type(solver_init_data)  solver_data,
type(grid_type), intent(in)  grid,
integer, dimension(grid%nnode), intent(in)  bc,
  ierr 
)

Set up a KSP solver for a single Helmholtz-type equation (Poisson equation, Ampere's law).

Parameters
[in,out]  solver  XGC solver object, type(xgc_solver), see module_psn.F90
[in]  solver_data  Grid and magnetic field data needed for the perp. gradient
[in]  grid  XGC grid object, type(grid_type)
[in]  bc  Solver boundary mask, 1 if not in boundary vertex list
[out]  ierr  PETSc error code


subroutine petsc_solver_module::create_spectral_helmholtz_solver ( type(xgc_solver)  solver,
type(solver_init_data)  solver_data,
type(grid_type), intent(in)  grid,
integer, dimension(grid%nnode), intent(in)  bc,
integer, intent(in)  ntor,
  ierr 
)

Set up a KSP solver for a single toroidal mode number component of a Helmholtz-type equation (Poisson equation, Ampere's law).

Parameters
[in,out]  solver  XGC solver object, type(xgc_solver), see module_psn.F90
[in]  solver_data  Grid and magnetic field data needed for the perp. gradient
[in]  grid  XGC grid object, type(grid_type)
[in]  bc  Solver boundary mask, 1 if not in boundary vertex list
[in]  ntor  Toroidal mode number for which to set up the solver
[out]  ierr  PETSc error code


subroutine petsc_solver_module::diffusion_matrix_init ( type(xgc_ts)  diffusion_ts,
type(grid_type), intent(in)  grid,
integer, dimension(grid%nnode), intent(in)  bc,
intent(out)  solver_template_mat,
  ierr 
)

Set up a template matrix for XGC's anomalous diffusion time integrator. This is currently a system of either 4 (adiabatic electrons) or 7 (kinetic electrons) equations. Impurities are not supported yet.

Parameters
[in,out]  diffusion_ts  PETSc time integrator object (TS) for the diffusion solver
[in]  grid  XGC grid object, type(grid_type)
[in]  bc  Solver boundary mask, 1 if not in boundary vertex list
[out]  solver_template_mat  PETSc matrix with the template for the anomalous diffusion solver
[out]  ierr  PETSc error code


subroutine petsc_solver_module::getnnz ( type(grid_type), intent(in)  grid,
  nloc,
  low,
  high,
  d_nnz,
  o_nnz,
dimension(grid%nnode), intent(in)  xgc_petsc,
  nglobal,
  ierr 
)

Computes the number of non-zero entries nnz per row in the rank-local rows.

Parameters
[in]  grid  XGC grid object, type(grid_type)
[in]  nloc  Number of equations on this rank, PetscInt
[in]  low  Global index of the first equation on this rank, PetscInt
[in]  high  Global index of the last equation on this rank, PetscInt
[out]  d_nnz  Number of non-zero entries in the "diagonal" local part of the matrix
[out]  o_nnz  Number of non-zero entries in the "off-diagonal" local part of the matrix
[in]  xgc_petsc  Mapping between XGC grid vertices and PETSc equations
[in]  nglobal  Total number of equations, PetscInt
[out]  ierr  PETSc error code
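
A hedged sketch of the counting pattern, assuming PETSc's usual ownership convention (rows low to high-1 live on this rank), 0-based equation indices in xgc_petsc, and placeholder adjacency accessors num_neighbors/neighbor standing in for XGC's actual grid arrays:

    ! For each locally owned row, count couplings that fall into the
    ! "diagonal" block (columns owned by this rank) vs. the "off-diagonal" block.
    do i = 1, grid%nnode
      row = xgc_petsc(i)
      if (row < low .or. row >= high) cycle  ! vertex not owned by this rank
      iloc = row - low + 1
      d_nnz(iloc) = 1                        ! the diagonal entry itself
      o_nnz(iloc) = 0
      do k = 1, num_neighbors(i)             ! placeholder adjacency loop
        col = xgc_petsc(neighbor(i, k))
        if (col < 0) cycle                   ! neighbor not part of the solve
        if (col >= low .and. col < high) then
          d_nnz(iloc) = d_nnz(iloc) + 1
        else
          o_nnz(iloc) = o_nnz(iloc) + 1
        end if
      end do
    end do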


subroutine petsc_solver_module::init_helmholtz_solver ( type(psn_type), intent(inout)  psn,
type(grid_type), intent(in)  grid,
type(xgc_solver), intent(inout)  solver,
type(solver_init_data), intent(in)  solver_data,
integer, dimension(grid%nnode), intent(in)  bd,
integer, intent(in)  comm,
character(*), intent(in)  prefix,
integer, intent(in)  n_rhs_mat,
logical, intent(in)  is_axisym,
logical, intent(in)  is_spectral,
logical, intent(in)  is_ampere,
logical, intent(in)  is_ampere_cv,
logical, intent(in)  is_update 
)

Setup routine for the Poisson and Ampere's law solvers. This routine can be used for the initial setup, i.e., initializing the distributed PETSc matrices and the KSP solver, and for updating existing solvers for the evolving background profiles.

Parameters
[in,out]  psn  XGC field object, type(psn_type)
[in]  grid  XGC grid object, type(grid_type)
[in,out]  solver  XGC solver object, type(xgc_solver)
[in]  solver_data  Data required for solver setup (n, T, etc.), type(solver_init_data)
[in]  bd  Solver boundary mask, 1 if not in boundary vertex list
[in]  comm  Communicator of the solver, integer
[in]  prefix  Prefix string for the solver, character
[in]  n_rhs_mat  Number of right-hand side operators (1 or 2), integer
[in]  is_axisym  Set up axisymmetric solver if .true., logical
[in]  is_spectral  Set up spectral solver if .true., logical
[in]  is_ampere  Set up Ampere's law solver if .true., logical
[in]  is_ampere_cv  Set up control-variate Ampere's law solver if .true., logical
[in]  is_update  If .true., update an already initialized solver, logical
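
As a usage illustration only (all argument variable names below are hypothetical, not taken from the XGC source), an initial setup of an axisymmetric Poisson solver with one right-hand-side operator could look like:

    ! Illustrative first-time setup (is_update = .false.) of an
    ! axisymmetric Poisson solver; variable names are assumptions.
    call init_helmholtz_solver(psn, grid, my_solver, my_solver_data, &
                               my_bd_mask, my_comm, 'poisson_', 1,   &
                               .true., .false., .false., .false., .false.)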


subroutine petsc_solver_module::init_ksp_comm ( integer, intent(in)  nnode)

Initializes an MPI communicator for use with PETSc KSP solves (Poisson, Ampere). The size of the KSP comm group is set such that, if possible, the number of equations per MPI rank is larger than 5000, which is roughly the weak scaling rollover.

Parameters
[in]  nnode  Number of mesh vertices per poloidal plane, integer
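
A minimal sketch of the sizing logic described above, assuming a parent communicator parent_comm and the 5000-equations-per-rank target; names are illustrative:

    subroutine sketch_init_ksp_comm(nnode, parent_comm, ksp_comm)
      use mpi
      implicit none
      integer, intent(in)  :: nnode, parent_comm
      integer, intent(out) :: ksp_comm
      integer :: num_pe, rank, n_solver_pe, color, ierr

      call MPI_Comm_size(parent_comm, num_pe, ierr)
      call MPI_Comm_rank(parent_comm, rank, ierr)

      ! Keep at least ~5000 equations per rank (the weak-scaling rollover),
      ! but never request more ranks than are available.
      n_solver_pe = max(1, min(num_pe, nnode/5000))

      ! Ranks below n_solver_pe join the KSP communicator; the rest opt out.
      color = MPI_UNDEFINED
      if (rank < n_solver_pe) color = 0
      call MPI_Comm_split(parent_comm, color, rank, ksp_comm, ierr)
    end subroutine sketch_init_ksp_comm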


subroutine petsc_solver_module::petsc_get_bc_mat ( type(xgc_solver)  solver,
intent(in)  n_eq_tot,
intent(in)  n_eq_loc,
intent(out)  ierr 
)

Creates the (empty) LHS and RHS matrices for Dirichlet boundary conditions.

Parameters
[in,out]  solver  XGC solver object
[in]  n_eq_tot  Number of equations to solve, PetscInt
[in]  n_eq_loc  Number of equations on this rank, PetscInt
[out]  ierr  PETSc error code


subroutine petsc_solver_module::petsc_get_bd_map ( type(grid_type), intent(in)  grid,
integer, dimension(grid%nnode), intent(in)  bc,
intent(in)  n_boundary,
dimension(grid%nnode), intent(in)  petsc_xgc_bd,
dimension(n_boundary), intent(out)  petsc_bd_xgc 
)

Generate mapping from PETSc boundary conditions to XGC vertices. This requires the results of petsc_get_sizes because the number of boundary conditions is not known a priori.

Parameters
[in]  grid  XGC solver mesh, type(grid_type)
[in]  bc  Solver boundary mask, 1 if not in boundary vertex list
[in]  n_boundary  Number of boundary values, PetscInt
[in]  petsc_xgc_bd  Mapping from XGC vertices to PETSc boundary conditions
[out]  petsc_bd_xgc  Mapping from PETSc boundary conditions to XGC vertices
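
Since petsc_xgc_bd already stores, for every XGC vertex, its boundary-condition index (or a negative value for non-boundary vertices), the inverse map reduces to a single pass. A sketch, assuming 0-based boundary indices:

    do i = 1, grid%nnode
      if (petsc_xgc_bd(i) >= 0) petsc_bd_xgc(petsc_xgc_bd(i) + 1) = i
    end do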


subroutine petsc_solver_module::petsc_get_fsa_bc_mat ( type(xgc_solver)  solver,
intent(out)  ierr 
)

creates and allocates boundary/surface FSA matrices

Parameters
[in,out]  solver  XGC solver object
[out]  ierr  PETSc error code


subroutine petsc_solver_module::petsc_get_fsa_mat ( type(xgc_solver)  solver,
intent(in)  n_eq_tot,
intent(in)  n_eq_loc,
intent(out)  ierr 
)

Creates and allocates interior/surface FSA matrices.

Parameters
[in,out]  solver  XGC solver object
[in]  n_eq_tot  Number of equations to solve, PetscInt
[in]  n_eq_loc  Number of equations on this rank, PetscInt
[out]  ierr  PETSc error code


subroutine petsc_solver_module::petsc_get_mapping ( integer, intent(in)  nnode,
integer, dimension(nnode), intent(in)  bc,
integer, intent(in)  comm,
integer, intent(in)  num_pe,
intent(in)  n_eq_tot,
intent(in)  n_eq_loc,
dimension(n_eq_tot), intent(in)  xgc_proc,
dimension(0:num_pe), intent(inout)  proc_eq,
dimension(nnode), intent(out)  xgc_petsc,
dimension(n_eq_tot), intent(out)  petsc_xgc,
dimension(n_eq_loc), intent(out)  petscloc_xgc 
)

Based on the ParMETIS partition of the XGC domain, updates the mappings between XGC vertices and PETSc equations.

Parameters
[in]  nnode  Number of XGC vertices, integer
[in]  bc  Solver boundary mask, 1 if not in boundary vertex list
[in]  comm  MPI communicator for the distributed matrix solver, MPI_comm
[in]  num_pe  Number of MPI ranks in communicator comm, integer
[in]  n_eq_tot  Total number of PETSc equations, PetscInt
[in]  n_eq_loc  Local number of equations on the current rank, PetscInt
[in]  xgc_proc  Mapping from equation index to XGC rank, PetscInt
[in,out]  proc_eq  Mapping from XGC process to PETSc equation, integer
[out]  xgc_petsc  Mapping from XGC vertices to PETSc equations, integer
[out]  petsc_xgc  Mapping from PETSc equation index to XGC vertices, integer
[out]  petscloc_xgc  Mapping from local PETSc equations to XGC vertices, integer


subroutine petsc_solver_module::petsc_get_partitioning ( type(grid_type), intent(in)  grid,
integer, dimension(grid%nnode), intent(in)  bc,
type(solver_init_data), intent(in)  solver_data,
integer, intent(in)  comm,
integer, intent(in)  num_pe,
intent(in)  n_eq_tot_in,
logical, intent(in)  set_diffusion_matrix,
dimension(grid%nnode), intent(in)  xgc_petsc,
dimension(grid%nnode), intent(in)  petsc_xgc_bd,
intent(out)  n_eq_loc,
dimension(n_eq_tot_in), intent(out)  xgc_proc_out,
dimension(0:num_pe), intent(out)  proc_eq,
intent(out)  ierr 
)

Generates a domain partitioning of the part of the XGC mesh used by a solver (as defined by the boundary conditions bc) using ParMETIS, and builds a mapping between PETSc equation indices and XGC MPI ranks (those in the communicator comm used by the solver).

Parameters
[in]  grid  XGC solver mesh, type(grid_type)
[in]  bc  Solver boundary mask, 1 if not in boundary vertex list
[in]  solver_data  Grid and magnetic field data needed for solver setup, type(solver_init_data)
[in]  comm  MPI communicator used by the solver, integer
[in]  num_pe  Number of MPI ranks in communicator comm, integer
[in]  n_eq_tot_in  Total number of equations (i.e., number of vertices in the solver), PetscInt
[in]  set_diffusion_matrix  Whether to set up the diffusion or the Poisson/Ampere matrix
[in]  xgc_petsc  Mapping from XGC vertices to PETSc equations
[in]  petsc_xgc_bd  Mapping from XGC vertices to PETSc boundary conditions
[out]  n_eq_loc  Number of equations on the current rank, PetscInt
[out]  xgc_proc_out  Mapping from PETSc equation to XGC rank, PetscInt
[out]  proc_eq  Mapping from XGC process to PETSc equation number (in the form of the boundaries of the domain partitioning)
[out]  ierr  Error code


subroutine petsc_solver_module::petsc_get_scatter_multi_block ( integer, intent(in)  nnode,
intent(in)  ksp,
  mat_all_blocks,
  mat_one_block,
intent(in)  blocksize,
character(*), dimension(0:blocksize-1), intent(in)  varnames,
intent(in)  n_eq_tot,
intent(in)  n_eq_loc,
dimension(n_eq_loc), intent(in)  petscloc_xgc,
dimension(n_eq_tot), intent(in)  petsc_xgc,
  iss,
  to_petsc,
  from_petsc,
intent(out)  ierr 
)

Sets up a PETSc scatter object for a (2D) multi-block field-split solver.

Parameters
[in]  nnode  Number of XGC mesh vertices
[in]  ksp  PETSc Krylov subspace solver object, KSP
[in]  mat_all_blocks  Global matrix with all solver blocks, Mat
[in]  mat_one_block  Matrix of one single block, Mat
[in]  blocksize  Number of blocks, PetscInt
[in]  varnames  Names of the variables represented by each block, character
[in]  n_eq_tot  Number of equations in one block, PetscInt
[in]  n_eq_loc  Number of local equations in one block, PetscInt
[in]  petscloc_xgc  Rank-local mapping from PETSc equations to XGC vertices, PetscInt
[in]  petsc_xgc  Global mapping from PETSc equations to XGC vertices, PetscInt
[out]  iss  Index sets for mapping single blocks into the global matrix, IS
[out]  to_petsc  VecScatter object for scattering XGC vectors into the global PETSc block matrix, VecScatter
[out]  from_petsc  VecScatter object for scattering PETSc results to an XGC field, VecScatter
[out]  ierr  PETSc error code, PetscErrorCode
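
The index sets are what make PETSc's fieldsplit preconditioner aware of the block structure. A hedged sketch, assuming each rank's rows are grouped by block starting at its first global row rank_low (an illustrative layout, not necessarily XGC's):

    call KSPGetPC(ksp, pc, ierr)
    do ib = 0, blocksize-1
      ! rows of block ib owned by this rank
      call ISCreateStride(comm, n_eq_loc, rank_low + ib*n_eq_loc, 1, iss(ib), ierr)
      ! register the block with the fieldsplit preconditioner under its name
      call PCFieldSplitSetIS(pc, trim(varnames(ib)), iss(ib), ierr)
    end do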


subroutine petsc_solver_module::petsc_get_scatter_one_block ( type(xgc_solver)  solver,
integer, intent(in)  nnode,
intent(in)  n_eq_tot,
intent(in)  n_eq_loc,
dimension(n_eq_tot), intent(in)  petsc_xgc,
intent(out)  ierr 
)

Generate PETSc scatter mapping from XGC vertices to PETSc equation for a simple solver with one equation (block). Also sets up the corresponding LHS and RHS PETSc vectors.

Parameters
[in,out]  solver  XGC solver object
[in]  nnode  Number of XGC mesh vertices
[in]  n_eq_tot  Total number of PETSc equations, PetscInt
[in]  n_eq_loc  Number of PETSc equations on this rank, PetscInt
[in]  petsc_xgc  Mapping from PETSc equation index to XGC mesh vertices
[out]  ierr  PETSc error code
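
A hedged sketch of the scatter construction, assuming 0-based PETSc indices (including the vertex indices stored in petsc_xgc), a local ownership range starting at global row low, and illustrative vector/IS names:

    ! Each rank holds the full XGC field in a sequential vector and
    ! scatters the entries feeding its locally owned equations.
    call VecCreateSeq(PETSC_COMM_SELF, nnode, vec_xgc, ierr)
    call VecCreateMPI(comm, n_eq_loc, n_eq_tot, vec_petsc, ierr)
    ! XGC vertex feeding each locally owned PETSc equation
    call ISCreateGeneral(PETSC_COMM_SELF, n_eq_loc,               &
                         petsc_xgc(low+1:low+n_eq_loc),           &
                         PETSC_COPY_VALUES, is_from, ierr)
    call ISCreateStride(PETSC_COMM_SELF, n_eq_loc, low, 1, is_to, ierr)
    call VecScatterCreate(vec_xgc, is_from, vec_petsc, is_to, scatter, ierr)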


subroutine petsc_solver_module::petsc_get_sizes ( type(grid_type), intent(in)  grid,
integer, dimension(grid%nnode), intent(in)  bc,
intent(out)  n_equation,
intent(out)  n_boundary,
dimension(grid%nnode), intent(out)  xgc_petsc,
  petsc_xgc_bd 
)

Calculate (i) the number of equations (vertices) of the solver, (ii) the number of XGC boundary vertices included in the solver, (iii) the (preliminary) mapping from XGC vertices to PETSc equations, and (iv) the mapping from XGC vertices to PETSc boundary conditions, all based on the XGC boundary object passed as input.

Parameters
[in]  grid  XGC solver mesh, type(grid_type)
[in]  bc  Solver boundary mask, 1 if not in boundary vertex list
[out]  n_equation  Number of PETSc equations (matrix rows), integer
[out]  n_boundary  Number of boundary conditions/vertices, integer
[out]  xgc_petsc  (Preliminary) mapping from XGC vertices to PETSc equations
[out]  petsc_xgc_bd  Mapping from XGC vertices to PETSc boundary conditions
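
A minimal sketch of the counting pass, assuming bc(i) == 1 marks a solved (interior) vertex, every other vertex is a boundary vertex, and 0-based PETSc indices with -1 meaning "not mapped"; the actual encoding in XGC may differ:

    n_equation = 0
    n_boundary = 0
    do i = 1, grid%nnode
      if (bc(i) == 1) then
        xgc_petsc(i)    = n_equation   ! interior vertex -> next equation
        petsc_xgc_bd(i) = -1
        n_equation      = n_equation + 1
      else
        xgc_petsc(i)    = -1           ! boundary vertex: no equation
        petsc_xgc_bd(i) = n_boundary
        n_boundary      = n_boundary + 1
      end if
    end do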


subroutine petsc_solver_module::petsc_get_template_mat ( type(grid_type), intent(in)  grid,
integer, intent(in)  comm,
intent(in)  n_eq_tot,
intent(in)  n_eq_loc,
dimension(grid%nnode), intent(in)  xgc_petsc,
intent(out)  solver_template_mat,
  ierr 
)

Uses pre-computed (petsc_get_partitioning) local matrix sizes and XGC vertex to PETSc equation mapping to set up a blank template matrix with sufficient pre-allocated memory.

Parameters
[in]  grid  XGC grid object, type(grid_type)
[in]  comm  MPI communicator for the template matrix, MPI_comm
[in]  n_eq_tot  Total number of equations, PetscInt
[in]  n_eq_loc  Number of local equations on the current rank, PetscInt
[in]  xgc_petsc  Mapping from XGC vertices to PETSc equation index
[out]  solver_template_mat  Template matrix with the desired partitioning and sufficient preallocated memory
[out]  ierr  Error code
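
A sketch of the matrix creation (not XGC's exact code), combining the local sizes from petsc_get_partitioning with the per-row counts d_nnz/o_nnz from getnnz:

    PetscInt :: izero
    izero = 0
    call MatCreateAIJ(comm, n_eq_loc, n_eq_loc, n_eq_tot, n_eq_tot, &
                      izero, d_nnz, izero, o_nnz, solver_template_mat, ierr)
    ! fail loudly if an insertion ever exceeds the preallocation
    call MatSetOption(solver_template_mat, MAT_NEW_NONZERO_ALLOCATION_ERR, &
                      PETSC_TRUE, ierr)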


subroutine petsc_solver_module::petsc_to_xgc ( intent(in)  nnode,
intent(in)  blocksize,
dimension(0:blocksize-1), intent(in)  from_petsc,
intent(in)  field_petsc,
real (kind=8), dimension(nnode,blocksize), intent(inout)  field_xgc,
intent(out)  ierr 
)

This routine scatters values from the distributed vector compatible with the block-matrix of the 2D diffusion model to local variables on the XGC solver grid.

Parameters
[in]  nnode  Number of vertices per plane in the XGC mesh, PetscInt
[in]  blocksize  Number of equations, PetscInt
[in]  from_petsc  VecScatter object for moving data from a PETSc Vec to XGC mesh data, VecScatter
[in]  field_petsc  PETSc vector with the data to be scattered, PETSc Vec
[in,out]  field_xgc  XGC mesh array to which the data is scattered, real(8)
[out]  ierr  Error code, PetscErrorCode
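
The scatter itself follows PETSc's usual begin/end pattern, one scatter context per block. A sketch, with vec_loc as an illustrative local work vector holding one column of field_xgc:

    do ib = 0, blocksize-1
      call VecScatterBegin(from_petsc(ib), field_petsc, vec_loc, &
                           INSERT_VALUES, SCATTER_FORWARD, ierr)
      call VecScatterEnd(from_petsc(ib), field_petsc, vec_loc,   &
                         INSERT_VALUES, SCATTER_FORWARD, ierr)
      ! ... copy vec_loc into field_xgc(:, ib+1) ...
    end do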


subroutine petsc_solver_module::petsc_to_xgc_bd ( integer  nn,
type(xgc_solver)  this,
  x,
  xvec 
)

Scatters a PETSc vector to XGC boundary data.

Parameters
[in]  nn  Size of x, integer
[in]  this  XGC solver data object, type(xgc_solver)
[out]  x  XGC scalar field (defined on boundary vertices)
[in]  xvec  PETSc vector, Vec


subroutine petsc_solver_module::set_adiabatic_blending_weights ( type(grid_type), intent(in)  grid,
  vec,
dimension(grid%nnode), intent(in)  xgc_petsc,
intent(out)  ierr 
)

Sets values in a PETSc vector according to the blending function \(\alpha(\psi)\), where \(\alpha = 1\) in region 1 for \(\psi \le \psi_{\mathrm{in}}\), \(\alpha = \frac{\psi_{\mathrm{out}} - \psi}{\psi_{\mathrm{out}} - \psi_{\mathrm{in}}}\) in region 1 for \(\psi_{\mathrm{in}} < \psi < \psi_{\mathrm{out}}\), and \(\alpha = 0\) in region 1 for \(\psi_{\mathrm{out}} \le \psi\) and outside of region 1.

Parameters
[in]  grid  XGC grid object, type(grid_type)
[in,out]  vec  PETSc vector to store the blending weights, Vec
[in]  xgc_petsc  Mapping from XGC vertex index to PETSc equation number, PetscInt
[out]  ierr  Error code, PetscErrorCode
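
A hedged sketch of the weight evaluation; psi(i), psi_in, psi_out, and the region-1 test in_region_1(i) are placeholders for XGC's grid data, and equation indices are 0-based:

    do i = 1, grid%nnode
      row = xgc_petsc(i)
      if (row < 0) cycle                    ! vertex not part of the solver
      if (in_region_1(i) .and. psi(i) <= psi_in) then
        alpha = 1d0
      else if (in_region_1(i) .and. psi(i) < psi_out) then
        alpha = (psi_out - psi(i)) / (psi_out - psi_in)
      else
        alpha = 0d0                         ! psi >= psi_out or outside region 1
      end if
      call VecSetValue(vec, row, alpha, INSERT_VALUES, ierr)
    end do
    call VecAssemblyBegin(vec, ierr)
    call VecAssemblyEnd(vec, ierr)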


subroutine petsc_solver_module::xgc_to_petsc ( intent(in)  nnode,
intent(in)  blocksize,
dimension(0:blocksize-1), intent(in)  to_petsc,
  field_petsc,
real (kind=8), dimension(nnode,blocksize), intent(in)  field_xgc,
intent(out)  ierr 
)

This routine scatters values from local variables on the XGC solver grid to the distributed vector compatible with the block-matrix of the 2D diffusion model.

Parameters
[in]  nnode  Number of vertices per plane in the XGC mesh, PetscInt
[in]  blocksize  Number of equations, PetscInt
[in]  to_petsc  VecScatter object for moving data from XGC mesh data to a PETSc Vec, VecScatter
[out]  field_petsc  PETSc vector to which the data is scattered, PETSc Vec
[in]  field_xgc  XGC mesh array with the data to be scattered, real(8)
[out]  ierr  Error code, PetscErrorCode


subroutine petsc_solver_module::xgc_to_petsc_bd ( integer  nn,
type(xgc_solver)  this,
  x,
  xvec 
)

Scatters XGC boundary data to a PETSc vector.

Parameters
[in]  nn  Size of x, integer
[in]  this  XGC solver data object, type(xgc_solver)
[in]  x  XGC scalar field (defined on boundary vertices)
[out]  xvec  PETSc vector, Vec



The documentation for this module was generated from the following file: