FMS 2025.01-dev
Flexible Modeling System
mpp_domains_mod

Domain decomposition and domain update for message-passing codes. More...

Data Types

interface  check_data_size
 Private interface for internal usage, compares two sizes. More...
 
type  contact_type
 Type used to represent the contact between tiles. More...
 
type  domain1d
 One dimensional domain used to manage shared data access between pes. More...
 
type  domain1d_spec
 A private type used to specify index limits for a domain decomposition. More...
 
type  domain2d
 The domain2D type contains all the necessary information to define the global, compute and data domains of each task, as well as the PE associated with the task. The PEs from which remote data may be acquired to update the data domain are also contained in a linked list of neighbours. More...
 
type  domain2d_spec
 Private type to specify multiple index limits and pe information for a 2D domain. More...
 
type  domain_axis_spec
 Used to specify index limits along an axis of a domain. More...
 
type  domaincommunicator2d
 Used for sending domain data between PEs. More...
 
type  domainug
 Domain information for managing data on unstructured grids. More...
 
type  index_type
 index bounds for use in nestSpec More...
 
interface  mpp_broadcast_domain
 Broadcasts domain to every pe. Only useful outside the context of its own pelist. More...
 
interface  mpp_check_field
 Parallel checking between two ensembles which run on different sets of PEs at the same time.
There are two forms of the mpp_check_field call. The 2D version is generally used, and the 3D version is built by repeated calls to the 2D version.

Example usage: More...
 
interface  mpp_complete_do_update
 Private interface used for non blocking updates. More...
 
interface  mpp_complete_group_update
 Completes a pending non-blocking group update. Must follow a call to mpp_start_group_update. More...
 
interface  mpp_complete_update_domains
 Must be used after a call to mpp_start_update_domains in order to complete a nonblocking domain update. See mpp_start_update_domains for more info. More...
 
interface  mpp_copy_domain
 Copy 1D or 2D domain. More...
 
interface  mpp_create_group_update
 Constructor for the mpp_group_update_type which is then used with mpp_start_group_update. More...
 
interface  mpp_deallocate_domain
 Deallocate given 1D or 2D domain. More...
 
interface  mpp_define_domains
 Set up a domain decomposition. More...
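 A minimal usage sketch (illustrative only; ni, nj, the 2x2 layout and the halo width are assumed example values, not defaults of this interface):
   use mpp_domains_mod, only : domain2D, mpp_define_domains
   type(domain2D) :: domain
   integer        :: ni = 96, nj = 96        ! assumed global grid size
   integer        :: layout(2) = (/ 2, 2 /)  ! assumed 2x2 decomposition on 4 PEs
   ! decompose the global index space (1:ni,1:nj) with a halo of width 2 on each side
   call mpp_define_domains( (/1,ni,1,nj/), layout, domain, xhalo=2, yhalo=2 )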
 
interface  mpp_define_layout
 Retrieve layout associated with a domain decomposition. Given a global 2D domain and the number of divisions in the decomposition ndivs (usually the PE count unless some domains are masked), this call returns a 2D domain layout. By default, mpp_define_layout will attempt to divide the 2D index space into domains that maintain the aspect ratio of the global domain. If this cannot be done, the algorithm favours domains that are longer in x than y, a preference that could improve vector performance.
Example usage: More...
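 A short sketch, assuming the decomposition should use every PE (mpp_npes comes from mpp_mod; ni and nj are assumed values):
   integer :: layout(2)
   ! choose a 2D layout that divides the global (1:ni,1:nj) index space over all PEs
   call mpp_define_layout( (/1,ni,1,nj/), mpp_npes(), layout )
   call mpp_define_domains( (/1,ni,1,nj/), layout, domain )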
 
interface  mpp_define_null_domain
 Defines a nullified 1D or 2D domain. More...
 
interface  mpp_do_check
 Private interface to update the data domain of a 3D field whose computational domains have been computed. More...
 
interface  mpp_do_get_boundary
 
interface  mpp_do_get_boundary_ad
 
interface  mpp_do_global_field
 Private helper interface used by mpp_global_field. More...
 
interface  mpp_do_global_field_ad
 
interface  mpp_do_group_update
 
interface  mpp_do_redistribute
 
interface  mpp_do_update
 Private interface used for mpp_update_domains. More...
 
interface  mpp_do_update_ad
 Passes a data field from an unstructured grid to a structured grid
Example usage: More...
 
interface  mpp_do_update_nest_coarse
 Used by mpp_update_nest_coarse to perform domain updates. More...
 
interface  mpp_do_update_nest_fine
 
interface  mpp_get_boundary
 Get the boundary data for a symmetric domain when the data is at the C, E, or N-cell center.
mpp_get_boundary is used to get the boundary data for a symmetric domain when the data is at the C, E, or N-cell center. For the cubic grid, the data should always be at the C-cell center.
Example usage: More...
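 A hedged sketch (the field, the buffer sizing, and the CORNER position constant are assumptions for illustration):
   real, dimension(:), allocatable :: ebuffer, sbuffer, wbuffer, nbuffer
   ! allocation of the four boundary buffers to the proper boundary lengths is omitted here
   call mpp_get_boundary( field, domain, ebuffer=ebuffer, sbuffer=sbuffer, &
                          wbuffer=wbuffer, nbuffer=nbuffer, position=CORNER )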
 
interface  mpp_get_boundary_ad
 
interface  mpp_get_compute_domain
 These routines retrieve the axis specifications associated with the compute domains. The domain is a derived type with private elements. The 2D version of these is a simple extension of 1D.
Example usage: More...
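 A short sketch of the common 2D usage (variable names are illustrative; the same pattern applies to the data domain accessor below):
   integer :: is, ie, js, je, isd, ied, jsd, jed
   call mpp_get_compute_domain( domain, is, ie, js, je )    ! points computed on this PE
   call mpp_get_data_domain( domain, isd, ied, jsd, jed )   ! compute domain plus halos
   allocate( field(isd:ied,jsd:jed) )                       ! field is an assumed allocatable array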
 
interface  mpp_get_compute_domains
 Retrieve the entire array of compute domain extents associated with a decomposition. More...
 
interface  mpp_get_data_domain
 These routines retrieve the axis specifications associated with the data domains. The domain is a derived type with private elements. The 2D version of these is a simple extension of 1D.
Example usage: More...
 
interface  mpp_get_domain_extents
 
interface  mpp_get_f2c_index
 Get the index of the data passed from fine grid to coarse grid.
Example usage: More...
 
interface  mpp_get_global_domain
 These routines retrieve the axis specifications associated with the global domains. The domain is a derived type with private elements. The 2D version of these is a simple extension of 1D.
Example usage: More...
 
interface  mpp_get_global_domains
 
interface  mpp_get_layout
 Retrieve layout associated with a domain decomposition. The 1D version of this call returns the number of divisions that was assigned to this decomposition axis. The 2D version of this call returns an array of dimension 2 holding the results on the two axes.
Example usage: More...
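 A one-line sketch of the 2D form:
   integer :: layout(2)
   call mpp_get_layout( domain, layout )   ! e.g. layout = (/4,2/) for a 4x2 decomposition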
 
interface  mpp_get_memory_domain
 These routines retrieve the axis specifications associated with the memory domains. The domain is a derived type with private elements. The 2D version of these is a simple extension of 1D.
Example usage: More...
 
interface  mpp_get_neighbor_pe
 Retrieve PE number of a neighboring domain. More...
 
interface  mpp_get_pelist
 Retrieve list of PEs associated with a domain decomposition. The 1D version of this call returns an array of the PEs assigned to this 1D domain decomposition. In addition the optional argument pos may be used to retrieve the 0-based position of the domain local to the calling PE, i.e., domain%list(pos)%pe is the local PE, as returned by mpp_pe(). The 2D version of this call is identical to the 1D version. More...
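 A hedged sketch of the 2D form, sizing the list with mpp_get_domain_npes (names are illustrative):
   integer, allocatable :: pelist(:)
   allocate( pelist(mpp_get_domain_npes(domain)) )
   call mpp_get_pelist( domain, pelist )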
 
interface  mpp_global_field
 Fill in a global array from domain-decomposed arrays.
More...
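 A hedged sketch (the array bounds and nk are assumptions; the local array must cover at least the compute domain):
   real :: local(is:ie,js:je,nk), global(isg:ieg,jsg:jeg,nk)
   call mpp_global_field( domain, local, global )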
 
interface  mpp_global_field_ad
 
interface  mpp_global_field_ug
 Same functionality as mpp_global_field but for unstructured domains. More...
 
interface  mpp_global_max
 Global max of domain-decomposed arrays.
mpp_global_max is used to get the maximum value of a domain-decomposed array on each PE. MPP_TYPE_ can be of type integer or real; of 4-byte or 8-byte kind; of rank up to 5. The dimension of locus must equal the rank of field.

All PEs in a domain decomposition must call mpp_global_max, and each will have the result upon exit. The function mpp_global_min, with identical syntax, is also available. More...
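 A short sketch (array bounds are assumed):
   real :: field(isd:ied,jsd:jed), fmax, fmin
   fmax = mpp_global_max( domain, field )
   fmin = mpp_global_min( domain, field )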
 
interface  mpp_global_min
 Global min of domain-decomposed arrays.
mpp_global_min is used to get the minimum value of a domain-decomposed array on each PE. MPP_TYPE_ can be of type integer or real; of 4-byte or 8-byte kind; of rank up to 5. The dimension of locus must equal the rank of field.

All PEs in a domain decomposition must call mpp_global_min, and each will have the result upon exit. The function mpp_global_max, with identical syntax, is also available. More...
 
interface  mpp_global_sum
 Global sum of domain-decomposed arrays.
mpp_global_sum is used to get the sum of a domain-decomposed array on each PE. MPP_TYPE_ can be of type integer, complex, or real; of 4-byte or 8-byte kind; of rank up to 5. More...
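 A hedged sketch (array bounds are assumed; treat the BITWISE_EXACT_SUM flag usage as an example of an optional argument, shown here for a reproducible summation order):
   real :: field(isd:ied,jsd:jed), gsum
   gsum = mpp_global_sum( domain, field )
   gsum = mpp_global_sum( domain, field, flags=BITWISE_EXACT_SUM )   ! order-reproducible sum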
 
interface  mpp_global_sum_ad
 
interface  mpp_global_sum_tl
 
type  mpp_group_update_type
 used for updates on a group More...
 
interface  mpp_modify_domain
 Modifies the extents (compute, data and global) of a given domain. More...
 
interface  mpp_nullify_domain_list
 Nullify domain list. This interface is needed in mpp_domains_test. The 1-D case can be added in if needed.
Example usage: More...
 
interface  mpp_pass_sg_to_ug
 Passes data from a structured grid to an unstructured grid
Example usage: More...
 
interface  mpp_pass_ug_to_sg
 Passes a data field from an unstructured grid to a structured grid
Example usage: More...
 
interface  mpp_redistribute
 Reorganization of distributed global arrays.
mpp_redistribute is used to reorganize a distributed array. MPP_TYPE_ can be of type integer, complex, or real; of 4-byte or 8-byte kind; of rank up to 5.
Example usage: call mpp_redistribute( domain_in, field_in, domain_out, field_out ) More...
 
interface  mpp_reset_group_update_field
 
interface  mpp_set_compute_domain
 These routines set the axis specifications associated with the compute domains. The domain is a derived type with private elements. The 2D version of these is a simple extension of 1D.
Example usage: More...
 
interface  mpp_set_data_domain
 These routines set the axis specifications associated with the data domains. The domain is a derived type with private elements. The 2D version of these is a simple extension of 1D.
Example usage: More...
 
interface  mpp_set_global_domain
 These routines set the axis specifications associated with the global domains. The domain is a derived type with private elements. The 2D version of these is a simple extension of 1D.
Example usage: More...
 
interface  mpp_start_do_update
 Private interface used for non blocking updates. More...
 
interface  mpp_start_group_update
 Starts a non-blocking group update. Must be followed by a call to mpp_complete_group_update. An mpp_group_update_type can be created with mpp_create_group_update. More...
 
interface  mpp_start_update_domains
 Interface to start halo updates. mpp_start_update_domains is used to start a halo update of a domain-decomposed array on each PE. MPP_TYPE_ can be of type complex, integer, logical or real; of 4-byte or 8-byte kind; of rank up to 5. The vector version (with two input data fields) is only present for real types.

mpp_start_update_domains must be paired with mpp_complete_update_domains. In mpp_start_update_domains, a buffer will be pre-posted to receive (non-blocking) the data, and the data in the computational domain will be packed and sent (non-blocking) to the other processors. In mpp_complete_update_domains, the buffer will be unpacked to fill the halo, and mpp_sync_self will be called to ensure communication is safe at the last call of mpp_complete_update_domains.

Each mpp_update_domains call can be replaced by the combination of mpp_start_update_domains and mpp_complete_update_domains. The arguments in mpp_start_update_domains and mpp_complete_update_domains should be exactly the same as in the mpp_update_domains call being replaced, except that there is no optional argument "complete". The following are examples of how to replace mpp_update_domains with mpp_start_update_domains/mpp_complete_update_domains. More...
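 A hedged sketch of the non-blocking pair replacing a single blocking update (field and domain are assumed to be set up already):
   integer :: id_update
   id_update = mpp_start_update_domains( field, domain )
   ! ... computation that does not touch the halo region ...
   call mpp_complete_update_domains( id_update, field, domain )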
 
interface  mpp_update_domains
 Performs halo updates for a given domain.
More...
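 Two hedged one-line sketches (the BGRID_NE grid type for the vector form is an assumed example):
   call mpp_update_domains( field, domain )                     ! scalar halo update
   call mpp_update_domains( u, v, domain, gridtype=BGRID_NE )   ! paired vector components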
 
interface  mpp_update_domains_ad
 Similar to mpp_update_domains, updates adjoint domains. More...
 
interface  mpp_update_nest_coarse
 Pass the data from fine grid to fill the buffer to be ready to be interpolated onto coarse grid.
Example usage: More...
 
interface  mpp_update_nest_fine
 Pass the data from coarse grid to fill the buffer to be ready to be interpolated onto fine grid.
Example usage: More...
 
type  nest_domain_type
 domain with nested fine and coarse tiles More...
 
type  nest_level_type
 Private type to hold data for each level of nesting. More...
 
type  nestspec
 Used to specify bounds and index information for nested tiles as a linked list. More...
 
type  nonblock_type
 Used for nonblocking data transfer. More...
 
interface  operator(.eq.)
 Equality/inequality operators for domaintypes.
More...
 
interface  operator(.ne.)
 
type  overlap_type
 Type for overlapping data. More...
 
type  overlapspec
 Private type for overlap specifications. More...
 
type  tile_type
 Upper and lower x and y bounds for a tile. More...
 
type  unstruct_axis_spec
 Private type for axis specification data for an unstructured grid. More...
 
type  unstruct_domain_spec
 Private type for axis specification data for an unstructured domain. More...
 
type  unstruct_overlap_type
 Private type. More...
 
type  unstruct_pass_type
 Private type. More...
 

Functions/Subroutines

subroutine add_check_overlap (overlap_out, overlap_in)
 this routine adds the overlap_in into overlap_out
 
subroutine add_update_overlap (overlap_out, overlap_in)
 
subroutine allocate_check_overlap (overlap, count)
 
subroutine allocate_nest_overlap (overlap, count)
 
subroutine allocate_update_overlap (overlap, count)
 
subroutine apply_cyclic_offset (lstart, lend, offset, gstart, gend, gsize)
 add offset to the index
 
subroutine check_alignment (is, ie, js, je, isg, ieg, jsg, jeg, alignment)
 
subroutine check_data_size_1d (module, str1, size1, str2, size2)
 
subroutine check_data_size_2d (module, str1, isize1, jsize1, str2, isize2, jsize2)
 
subroutine check_message_size (domain, update, send, recv, position)
 
subroutine check_overlap_pe_order (domain, overlap, name)
 
subroutine compute_overlap_coarse_to_fine (nest_domain, overlap, extra_halo, position, name)
 
subroutine compute_overlap_fine_to_coarse (nest_domain, overlap, position, name)
 This routine will compute the send and recv information between overlapped nesting regions. The data is assumed to be at T-cell centers.
 
subroutine compute_overlaps (domain, position, update, check, ishift, jshift, x_cyclic_offset, y_cyclic_offset, whalo, ehalo, shalo, nhalo)
 Computes remote domain overlaps.
 
subroutine compute_overlaps_fold_east (domain, position, ishift, jshift)
 computes remote domain overlaps, assuming only one in each direction; will calculate the overlapping for T,E,C,N-cells separately. Here a folded-east and y-cyclic boundary condition is assumed
 
subroutine compute_overlaps_fold_south (domain, position, ishift, jshift)
 Computes remote domain overlaps, assuming only one in each direction; will calculate the overlapping for T,E,C,N-cells separately.
 
subroutine compute_overlaps_fold_west (domain, position, ishift, jshift)
 Computes remote domain overlaps, assuming only one in each direction; will calculate the overlapping for T,E,C,N-cells separately.
 
subroutine convert_index_back (domain, ishift, jshift, rotate, is_in, ie_in, js_in, je_in, is_out, ie_out, js_out, je_out)
 
integer function convert_index_to_coarse (domain, ishift, jshift, tile_coarse, istart_coarse, iend_coarse, jstart_coarse, jend_coarse, ntiles_coarse, tile_in, is_in, ie_in, js_in, je_in, is_out, ie_out, js_out, je_out, rotate_out)
 
integer function convert_index_to_nest (domain, ishift, jshift, tile_coarse, istart_coarse, iend_coarse, jstart_coarse, jend_coarse, ntiles_coarse, tile_in, is_in, ie_in, js_in, je_in, is_out, ie_out, js_out, je_out, rotate_out)
 This routine will convert the global coarse grid index to nest grid index.
 
subroutine copy_nest_overlap (overlap_out, overlap_in)
 
subroutine deallocate_comm (d_comm)
 
subroutine deallocate_domain2d_local (domain)
 
subroutine deallocate_nest_overlap (overlap)
 
subroutine deallocate_overlap_type (overlap)
 
subroutine deallocate_overlapspec (overlap)
 
subroutine debug_message_size (overlap, name)
 
subroutine define_contact_point (domain, position, num_contact, tile1, tile2, align1, align2, refine1, refine2, istart1, iend1, jstart1, jend1, istart2, iend2, jstart2, jend2, isglist, ieglist, jsglist, jeglist)
 compute the overlapping between tiles for the T-cell.
 
subroutine define_nest_level_type (nest_domain, x_refine, y_refine, extra_halo)
 
subroutine expand_check_overlap_list (overlaplist, npes)
 
subroutine expand_update_overlap_list (overlaplist, npes)
 
subroutine fill_contact (contact, tile, is1, ie1, js1, je1, is2, ie2, js2, je2, align1, align2, refine1, refine2)
 always fill the contact according to index order.
 
subroutine fill_corner_contact (econt, scont, wcont, ncont, isg, ieg, jsg, jeg, numr, nums, tilerecv, tilesend, is1recv, ie1recv, js1recv, je1recv, is2recv, ie2recv, js2recv, je2recv, is1send, ie1send, js1send, je1send, is2send, ie2send, js2send, je2send, align1recv, align2recv, align1send, align2send, whalo, ehalo, shalo, nhalo, tileme)
 
subroutine fill_overlap (overlap, domain, m, is, ie, js, je, isc, iec, jsc, jec, isg, ieg, jsg, jeg, dir, reverse, symmetry)
 
subroutine fill_overlap_recv_fold (overlap, domain, m, is, ie, js, je, isd, ied, jsd, jed, isg, ieg, dir, ishift, position, ioff, middle, symmetry)
 
subroutine fill_overlap_recv_nofold (overlap, domain, m, is, ie, js, je, isd, ied, jsd, jed, isg, ieg, dir, ioff, is_cyclic, folded, symmetry)
 
subroutine fill_overlap_send_fold (overlap, domain, m, is, ie, js, je, isc, iec, jsc, jec, isg, ieg, dir, ishift, position, ioff, middle, symmetry)
 
subroutine fill_overlap_send_nofold (overlap, domain, m, is, ie, js, je, isc, iec, jsc, jec, isg, ieg, dir, ioff, is_cyclic, folded, symmetry)
 
integer function find_index (array, index_data, start_pos)
 
integer function find_key (key, sorted, insert)
 
subroutine free_comm (domain_id, l_addr, l_addr2)
 
subroutine get_coarse_index (rotate, is, ie, js, je, iadd, jadd, is_c, ie_c, js_c, je_c)
 
type(domaincommunicator2d) function, pointer get_comm (domain_id, l_addr, l_addr2)
 
subroutine get_fold_index_east (jsg, jeg, ieg, jshift, position, is, ie, js, je)
 
subroutine get_fold_index_north (isg, ieg, jeg, ishift, position, is, ie, js, je)
 
subroutine get_fold_index_south (isg, ieg, jsg, ishift, position, is, ie, js, je)
 
subroutine get_fold_index_west (jsg, jeg, isg, jshift, position, is, ie, js, je)
 
integer function get_nest_vector_recv (nest_domain, update_x, update_y, ind_x, ind_y, start_pos, pelist)
 
integer function get_nest_vector_send (nest_domain, update_x, update_y, ind_x, ind_y, start_pos, pelist)
 
subroutine get_nnest (domain, num_nest, tile_coarse, istart_coarse, iend_coarse, jstart_coarse, jend_coarse, x_refine, y_refine, nnest, t_coarse, ncross_coarse, rotate_coarse, is_coarse, ie_coarse, js_coarse, je_coarse, is_fine, ie_fine, js_fine, je_fine)
 
subroutine init_index_type (indexdata)
 
subroutine init_overlap_type (overlap)
 
subroutine insert_check_overlap (overlap, pe, tileme, dir, rotation, is, ie, js, je)
 
subroutine insert_nest_overlap (overlap, pe, is, ie, js, je, dir, rotation)
 
subroutine insert_overlap_type (overlap, pe, tileme, tilenbr, is, ie, js, je, dir, rotation, from_contact)
 
subroutine insert_update_overlap (overlap, pe, is1, ie1, js1, je1, is2, ie2, js2, je2, dir, reverse, symmetry)
 
subroutine mpp_compute_block_extent (isg, ieg, ndivs, ibegin, iend)
 Computes the extents of a grid block.
 
subroutine mpp_compute_extent (isg, ieg, ndivs, ibegin, iend, extent)
 Computes extents for a grid decomposition with the given indices and divisions.
 
subroutine mpp_deallocate_domain1d (domain)
 
subroutine mpp_deallocate_domain2d (domain)
 
subroutine mpp_define_domains1d (global_indices, ndivs, domain, pelist, flags, halo, extent, maskmap, memory_size, begin_halo, end_halo)
 Define data and computational domains on a 1D set of data (isg:ieg) and assign them to PEs.
 
subroutine mpp_define_domains2d (global_indices, layout, domain, pelist, xflags, yflags, xhalo, yhalo, xextent, yextent, maskmap, name, symmetry, memory_size, whalo, ehalo, shalo, nhalo, is_mosaic, tile_count, tile_id, complete, x_cyclic_offset, y_cyclic_offset)
 Define 2D data and computational domain on global rectilinear cartesian domain (isg:ieg,jsg:jeg) and assign them to PEs.
 
subroutine mpp_define_io_domain (domain, io_layout)
 Define the layout for IO PEs for the given domain.
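 A one-line sketch (the 1x4 io_layout is an assumed example; it should divide the domain layout):
   call mpp_define_io_domain( domain, (/1,4/) )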
 
subroutine mpp_define_layout2d (global_indices, ndivs, layout)
 
subroutine mpp_define_mosaic (global_indices, layout, domain, num_tile, num_contact, tile1, tile2, istart1, iend1, jstart1, jend1, istart2, iend2, jstart2, jend2, pe_start, pe_end, pelist, whalo, ehalo, shalo, nhalo, xextent, yextent, maskmap, name, memory_size, symmetry, xflags, yflags, tile_id)
 Defines a domain for mosaic tile grids.
 
subroutine mpp_define_mosaic_pelist (sizes, pe_start, pe_end, pelist, costpertile)
 Defines a pelist for use with mosaic tiles.
 
subroutine mpp_define_nest_domains (nest_domain, domain, num_nest, nest_level, tile_fine, tile_coarse, istart_coarse, icount_coarse, jstart_coarse, jcount_coarse, npes_nest_tile, x_refine, y_refine, extra_halo, name)
 Set up a domain to pass data between aligned coarse and fine grids of a nested model.
 
subroutine mpp_define_null_domain1d (domain)
 
subroutine mpp_define_null_domain2d (domain)
 
subroutine mpp_get_c2f_index (nest_domain, is_fine, ie_fine, js_fine, je_fine, is_coarse, ie_coarse, js_coarse, je_coarse, dir, nest_level, position)
 Get the index of the data passed from coarse grid to fine grid.
 
subroutine mpp_get_f2c_index_coarse (nest_domain, is_coarse, ie_coarse, js_coarse, je_coarse, nest_level, position)
 
subroutine mpp_get_f2c_index_fine (nest_domain, is_coarse, ie_coarse, js_coarse, je_coarse, is_fine, ie_fine, js_fine, je_fine, nest_level, position)
 
type(domain2d) function, pointer mpp_get_nest_coarse_domain (nest_domain, nest_level)
 
type(domain2d) function, pointer mpp_get_nest_fine_domain (nest_domain, nest_level)
 
integer function mpp_get_nest_fine_npes (nest_domain, nest_level)
 
subroutine mpp_get_nest_fine_pelist (nest_domain, nest_level, pelist)
 
integer function mpp_get_nest_npes (nest_domain, nest_level)
 
subroutine mpp_get_nest_pelist (nest_domain, nest_level, pelist)
 
subroutine mpp_global_field_free_comm (domain, l_addr, ksize, l_addr2, flags)
 
type(domaincommunicator2d) function, pointer mpp_global_field_init_comm (domain, l_addr, isize_g, jsize_g, isize_l, jsize_l, ksize, l_addr2, flags, position)
 initializes a DomainCommunicator2D type for use in mpp_global_field
 
logical function mpp_is_nest_coarse (nest_domain, nest_level)
 
logical function mpp_is_nest_fine (nest_domain, nest_level)
 
subroutine mpp_modify_domain1d (domain_in, domain_out, cbegin, cend, gbegin, gend, hbegin, hend)
 Modifies the extents of a domain.
 
subroutine mpp_modify_domain2d (domain_in, domain_out, isc, iec, jsc, jec, isg, ieg, jsg, jeg, whalo, ehalo, shalo, nhalo)
 
logical function mpp_mosaic_defined ()
 Accessor function for value of mosaic_defined.
 
subroutine mpp_redistribute_free_comm (domain_in, l_addr, domain_out, l_addr2, ksize, lsize)
 
type(domaincommunicator2d) function, pointer mpp_redistribute_init_comm (domain_in, l_addrs_in, domain_out, l_addrs_out, isize_in, jsize_in, ksize_in, isize_out, jsize_out, ksize_out)
 
subroutine mpp_shift_nest_domains (nest_domain, domain, delta_i_coarse, delta_j_coarse, extra_halo)
 Based on mpp_define_nest_domains, but just resets the positioning of the nest. Modifies the parent/coarse start and end indices of the nest location and computes new overlaps of nest PEs on parent PEs. Ramstrom/HRD Moving Nest.
 
subroutine pop_key (sorted, idx, n_idx, key_idx)
 
subroutine print_nest_overlap (overlap, msg)
 
integer function push_key (sorted, idx, n_idx, insert, key, ival)
 
type(nestspec) function, pointer search_c2f_nest_overlap (nest_domain, nest_level, extra_halo, position)
 
type(nestspec) function, pointer search_f2c_nest_overlap (nest_domain, nest_level, position)
 
subroutine set_bound_overlap (domain, position)
 set up the overlapping for the boundary if the domain is symmetric.
 
subroutine set_check_overlap (domain, position)
 set up the overlapping for the boundary check if the domain is symmetric. The check will be done on the current PE for the east boundary for the E-cell, the north boundary for the N-cell, and the east and north boundaries for the C-cell.
 
subroutine set_contact_point (domain, position)
 this routine sets the overlapping between tiles for E,C,N-cell based on T-cell overlapping
 
subroutine set_domain_comm_inf (update)
 
integer(i8_kind) function set_domain_id (d_id, ksize, flags, gtype, position, whalo, ehalo, shalo, nhalo)
 
subroutine set_overlaps (domain, overlap_in, overlap_out, whalo_out, ehalo_out, shalo_out, nhalo_out)
 this routine sets up the overlapping for mpp_update_domains for an arbitrary halo update, which should be within the halo size defined in mpp_define_domains. xhalo_out, yhalo_out should not be exactly the same as xhalo_in, yhalo_in. The tripolar grid situation is currently not considered, because in the folded north region the overlapping is specified through a list of points, not through a rectangle; this will be revisited in the future.
 
subroutine set_single_overlap (overlap_in, overlap_out, isoff, ieoff, jsoff, jeoff, index, dir, rotation)
 

Variables

integer, save a2_sort_len =0
 length sorted memory list
 
integer, save a_sort_len =0
 length sorted memory list
 
integer(i8_kind), parameter addr2_base = 65536_i8_kind
 = 0x0000000000010000
 
integer, dimension(-1:max_addrs2), save addrs2_idx =-9999
 index of addr2 associated with d_comm
 
integer(i8_kind), dimension(max_addrs2), save addrs2_sorted =-9999
 list of sorted local addresses
 
integer, dimension(-1:max_addrs), save addrs_idx =-9999
 index of address associated with d_comm
 
integer(i8_kind), dimension(max_addrs), save addrs_sorted =-9999
 list of sorted local addresses
 
logical complete_group_update_on = .false.
 
logical complete_update = .false.
 
integer current_id_update = 0
 
type(domaincommunicator2d), dimension(:), allocatable, target, save d_comm
 domain communicators
 
integer, dimension(-1:max_fields), save d_comm_idx =-9999
 index of d_comm associated with sorted addresses
 
integer, save dc_sort_len =0
 length sorted comm keys (=num active communicators)
 
integer(i8_kind), dimension(max_fields), save dckey_sorted =-9999
 list of sorted local addresses
 
logical debug = .FALSE.
 
logical debug_message_passing = .false.
 If .true., the consistency on the boundary between processors/tiles will be checked when updating the domain for a symmetric domain, and the consistency on the north folded edge will also be checked.
 
character(len=32) debug_update_domain = "none"
 namelist interface
 
integer debug_update_level = NO_CHECK
 
logical domain_clocks_on =.FALSE.
 
integer(i8_kind) domain_cnt =0
 
logical efp_sum_overflow_check = .false.
 If .true., always do overflow_check when doing EFP bitwise mpp_global_sum.
 
integer, parameter field_s = 0
 
integer, parameter field_x = 1
 
integer, parameter field_y = 2
 
integer group_pack_clock =0
 
integer group_recv_clock =0
 
integer group_send_clock =0
 
integer group_unpk_clock =0
 
integer group_update_buffer_pos = 0
 
integer group_wait_clock =0
 
integer(i8_kind), parameter gt_base = 256_i8_kind
 
integer, save i_sort_len =0
 length sorted domain ids list
 
integer, dimension(-1:max_dom_ids), save ids_idx =-9999
 index of d_comm associated with sorted addresses
 
integer(i8_kind), dimension(max_dom_ids), save ids_sorted =-9999
 list of sorted domain identifiers
 
integer(i8_kind), parameter ke_base = 281474976710656_i8_kind
 
integer, parameter max_addrs =512
 
integer, parameter max_addrs2 =128
 
integer, parameter max_dom_ids =128
 
integer, parameter max_fields =1024
 
integer, parameter max_nonblock_update = 100
 
integer, parameter maxlist = 100
 
integer, parameter maxoverlap = 200
 
logical module_is_initialized = .false.
 
logical mosaic_defined = .false.
 
integer mpp_domains_stack_hwm =0
 
integer mpp_domains_stack_size =0
 
integer, save n_addrs =0
 number of memory addresses used
 
integer, save n_addrs2 =0
 number of memory addresses used
 
integer, save n_comm =0
 number of communicators used
 
integer, save n_ids =0
 number of domain ids used (=i_sort_len; domain ids are never removed)
 
integer nest_pack_clock =0
 
integer nest_recv_clock =0
 
integer nest_send_clock =0
 
integer nest_unpk_clock =0
 
integer nest_wait_clock =0
 
integer, parameter no_check = -1
 
integer nonblock_buffer_pos = 0
 
type(nonblock_type), dimension(:), allocatable nonblock_data
 
integer nonblock_group_buffer_pos = 0
 
integer nonblock_group_pack_clock =0
 
integer nonblock_group_recv_clock =0
 
integer nonblock_group_send_clock =0
 
integer nonblock_group_unpk_clock =0
 
integer nonblock_group_wait_clock =0
 
integer nthread_control_loop = 8
 Determines the loop order for packing and unpacking. When the number of threads is greater than nthread_control_loop, the k-loop will be moved outside and combined with the number of packs and unpacks. When the number of threads is less than or equal to nthread_control_loop, the k-loop is moved inside, but still outside of, the j,i loops.
 
type(domain1d), save, public null_domain1d
 
type(domain2d), save, public null_domain2d
 
type(domainug), save, public null_domainug
 
integer num_nonblock_group_update = 0
 
integer num_update = 0
 
integer pack_clock =0
 
integer pe
 
integer recv_clock =0
 
integer recv_clock_nonblock =0
 
integer send_clock =0
 
integer send_pack_clock_nonblock =0
 
logical start_update = .true.
 
integer unpk_clock =0
 
integer unpk_clock_nonblock =0
 
logical use_alltoallw = .false.
 
logical verbose =.FALSE.
 
integer wait_clock =0
 
integer wait_clock_nonblock =0
 
subroutine mpp_domains_set_stack_size (n)
 Set user stack size.
 
logical function mpp_domain1d_eq (a, b)
 Returns whether two 1D domains are equal.
 
logical function mpp_domain1d_ne (a, b)
 Returns whether two 1D domains are not equal.
 
logical function mpp_domain2d_eq (a, b)
 Returns whether two 2D domains are equal.
 
logical function mpp_domain2d_ne (a, b)
 Returns whether two 2D domains are not equal.
 
subroutine mpp_get_compute_domain1d (domain, begin, end, size, max_size, is_global)
 Retrieves the compute domain limits of a 1D domain.
 
subroutine mpp_get_data_domain1d (domain, begin, end, size, max_size, is_global)
 Retrieves the data domain limits of a 1D domain.
 
subroutine mpp_get_global_domain1d (domain, begin, end, size, max_size)
 Retrieves the global domain limits of a 1D domain.
 
subroutine mpp_get_memory_domain1d (domain, begin, end, size, max_size, is_global)
 Retrieves the memory domain limits of a 1D domain.
 
subroutine mpp_get_compute_domain2d (domain, xbegin, xend, ybegin, yend, xsize, xmax_size, ysize, ymax_size, x_is_global, y_is_global, tile_count, position)
 Retrieves the compute domain limits of a 2D domain.
 
subroutine mpp_get_data_domain2d (domain, xbegin, xend, ybegin, yend, xsize, xmax_size, ysize, ymax_size, x_is_global, y_is_global, tile_count, position)
 Retrieves the data domain limits of a 2D domain.
 
subroutine mpp_get_global_domain2d (domain, xbegin, xend, ybegin, yend, xsize, xmax_size, ysize, ymax_size, tile_count, position)
 Retrieves the global domain limits of a 2D domain.
 
subroutine mpp_get_memory_domain2d (domain, xbegin, xend, ybegin, yend, xsize, xmax_size, ysize, ymax_size, x_is_global, y_is_global, position)
 Retrieves the memory domain limits of a 2D domain.
 
subroutine mpp_set_super_grid_indices (grid)
 Modifies the indices in the domain_axis_spec type to those of the supergrid.
 
subroutine mpp_create_super_grid_domain (domain)
 Modifies the indices of the input domain to create the supergrid domain.
 
subroutine mpp_set_compute_domain1d (domain, begin, end, size, is_global)
 Sets the compute domain limits of a 1D domain.
 
subroutine mpp_set_compute_domain2d (domain, xbegin, xend, ybegin, yend, xsize, ysize, x_is_global, y_is_global, tile_count)
 Sets the compute domain limits of a 2D domain.
 
subroutine mpp_set_data_domain1d (domain, begin, end, size, is_global)
 Sets the data domain limits of a 1D domain.
 
subroutine mpp_set_data_domain2d (domain, xbegin, xend, ybegin, yend, xsize, ysize, x_is_global, y_is_global, tile_count)
 Sets the data domain limits of a 2D domain.
 
subroutine mpp_set_global_domain1d (domain, begin, end, size)
 Sets the global domain limits of a 1D domain.
 
subroutine mpp_set_global_domain2d (domain, xbegin, xend, ybegin, yend, xsize, ysize, tile_count)
 Sets the global domain limits of a 2D domain.
 
subroutine mpp_get_domain_components (domain, x, y, tile_count)
 Retrieve 1D components of 2D decomposition.
 
subroutine mpp_get_compute_domains1d (domain, begin, end, size)
 Retrieves the compute domain extents of all domains in a 1D decomposition.
 
subroutine mpp_get_compute_domains2d (domain, xbegin, xend, xsize, ybegin, yend, ysize, position)
 Retrieves the compute domain extents of all domains in a 2D decomposition.
 
subroutine mpp_get_global_domains1d (domain, begin, end, size)
 Retrieves the global domain extents of all domains in a 1D decomposition.
 
subroutine mpp_get_global_domains2d (domain, xbegin, xend, xsize, ybegin, yend, ysize, position)
 Retrieves the global domain extents of all domains in a 2D decomposition.
 
subroutine mpp_get_domain_extents1d (domain, xextent, yextent)
 Returns the x and y extents of each division of the decomposition.
 
subroutine mpp_get_domain_extents2d (domain, xextent, yextent)
 This will return xextent and yextent for each tile.
 
integer function mpp_get_domain_pe (domain)
 Returns the PE assigned to the given domain.
 
integer function mpp_get_domain_tile_root_pe (domain)
 Returns the root PE of the current tile of the domain.
 
integer function mpp_get_domain_tile_commid (domain)
 Returns the MPI communicator id of the current tile of the domain.
 
integer function mpp_get_domain_commid (domain)
 Returns the MPI communicator id of the domain.
 
type(domain2d) function, pointer mpp_get_io_domain (domain)
 Returns a pointer to the I/O domain of the given domain.
 
subroutine mpp_get_pelist1d (domain, pelist, pos)
 Retrieves the list of PEs associated with a 1D domain decomposition.
 
subroutine mpp_get_pelist2d (domain, pelist, pos)
 Retrieves the list of PEs associated with a 2D domain decomposition.
 
subroutine mpp_get_layout1d (domain, layout)
 Retrieves the layout associated with a 1D domain decomposition.
 
subroutine mpp_get_layout2d (domain, layout)
 Retrieves the layout associated with a 2D domain decomposition.
 
subroutine mpp_get_domain_shift (domain, ishift, jshift, position)
 Returns the shift value in the x and y-direction according to domain position.
 
subroutine mpp_get_neighbor_pe_1d (domain, direction, pe)
 Return PE to the right/left of this PE-domain.
 
subroutine mpp_get_neighbor_pe_2d (domain, direction, pe)
 Return PE North/South/East/West of this PE-domain. direction must be NORTH, SOUTH, EAST or WEST.
 
subroutine nullify_domain2d_list (domain)
 Nullifies the domain list of a 2D domain.
 
logical function mpp_domain_is_symmetry (domain)
 Returns whether the domain is symmetric.
 
logical function mpp_domain_is_initialized (domain)
 Returns whether the domain has been initialized.
 
logical function domain_update_is_needed (domain, whalo, ehalo, shalo, nhalo)
 Returns whether a halo update is needed for the given halo sizes.
 
type(overlapspec) function, pointer search_update_overlap (domain, whalo, ehalo, shalo, nhalo, position)
 this routine finds the overlap that has the same halo sizes as the input whalo, ehalo, shalo and nhalo
 
type(overlapspec) function, pointer search_check_overlap (domain, position)
 this routine finds the check overlap at a certain position
 
type(overlapspec) function, pointer search_bound_overlap (domain, position)
 This routine finds the bound overlap at a certain position.
 
integer function, dimension(size(domain%tile_id(:))) mpp_get_tile_id (domain)
 Returns the tile_id on current pe.
 
subroutine mpp_get_tile_list (domain, tiles)
 Return the tile_id on the current pelist. One-tile-per-pe is assumed.
 
integer function mpp_get_ntile_count (domain)
 Returns number of tiles in mosaic.
 
integer function mpp_get_current_ntile (domain)
 Returns number of tiles on current pe.
 
logical function mpp_domain_is_tile_root_pe (domain)
 Returns whether the current PE is the root PE of the tile. Returns true if the number of tiles on the current PE is greater than 1, or if isc==isg and jsc==jsg; otherwise returns false.
 
integer function mpp_get_tile_npes (domain)
 Returns number of processors used on current tile.
 
subroutine mpp_get_tile_pelist (domain, pelist)
 Get the processors list used on current tile.
 
subroutine mpp_get_tile_compute_domains (domain, xbegin, xend, ybegin, yend, position)
 Retrieves compute domain bounds for the PEs of the current tile.
 
integer function mpp_get_num_overlap (domain, action, p, position)
 Returns the number of overlaps for the given action, PE and position.
 
subroutine mpp_get_update_size (domain, nsend, nrecv, position)
 Returns the number of send and recv overlaps for a domain update.
 
subroutine mpp_get_update_pelist (domain, action, pelist, position)
 Returns the PE list involved in a domain update for the given action.
 
subroutine mpp_get_overlap (domain, action, p, is, ie, js, je, dir, rot, position)
 Returns overlap index information for the given action, PE and position.
 
character(len=name_length) function mpp_get_domain_name (domain)
 Returns the name of the domain.
 
integer function mpp_get_domain_root_pe (domain)
 Returns the root PE of the domain.
 
integer function mpp_get_domain_npes (domain)
 Returns the number of PEs in the domain decomposition.
 
subroutine mpp_get_domain_pelist (domain, pelist)
 Returns the PE list of the domain decomposition.
 
integer function, dimension(2) mpp_get_io_domain_layout (domain)
 Returns the I/O domain layout of the domain.
 
integer function get_rank_send (domain, overlap_x, overlap_y, rank_x, rank_y, ind_x, ind_y)
 Private helper returning send rank information for a domain update.
 
integer function get_rank_recv (domain, overlap_x, overlap_y, rank_x, rank_y, ind_x, ind_y)
 Private helper returning receive rank information for a domain update.
 
integer function get_vector_recv (domain, update_x, update_y, ind_x, ind_y, start_pos, pelist)
 Private helper returning receive information for a vector field update.
 
integer function get_vector_send (domain, update_x, update_y, ind_x, ind_y, start_pos, pelist)
 Private helper returning send information for a vector field update.
 
integer function get_rank_unpack (domain, overlap_x, overlap_y, rank_x, rank_y, ind_x, ind_y)
 Private helper returning unpack rank information for a domain update.
 
integer function get_mesgsize (overlap, do_dir)
 Returns the message size for the given overlap and directions.
 
subroutine mpp_set_domain_symmetry (domain, symmetry)
 Sets the symmetry flag of the domain.
 
recursive subroutine mpp_copy_domain1d (domain_in, domain_out)
 Copies input 1d domain to the output 1d domain.
 
subroutine mpp_copy_domain2d (domain_in, domain_out)
 Copies input 2d domain to the output 2d domain.
 
subroutine mpp_copy_domain2d_spec (domain2d_spec_in, domain2d_spec_out)
 Copies input 2d domain spec to the output 2d domain spec.
 
subroutine mpp_copy_domain1d_spec (domain1d_spec_in, domain1d_spec_out)
 Copies input 1d domain spec to the output 1d domain spec.
 
subroutine mpp_copy_domain_axis_spec (domain_axis_spec_in, domain_axis_spec_out)
 Copies input domain_axis_spec to the output domain_axis_spec.
 
subroutine set_group_update (group, domain)
 Sets up a group update for the given domain.
 
subroutine mpp_clear_group_update (group)
 Clears an mpp_group_update_type.
 
logical function mpp_group_update_initialized (group)
 Returns whether a group update has been initialized.
 
logical function mpp_group_update_is_set (group)
 Returns whether a group update is set.
 

Detailed Description

Domain decomposition and domain update for message-passing codes.

Instantiates a layout with the given indices and divisions.

Author
V. Balaji SGI/GFDL Princeton University

A set of simple calls for domain decomposition and domain updates on rectilinear grids. It requires the module mpp.F90, upon which it is built.
Scalable implementations of finite-difference codes are generally based on decomposing the model domain into subdomains that are distributed among processors. These domains will then be obliged to exchange data at their boundaries if data dependencies are merely nearest-neighbour, or may need to acquire information from the global domain if there are extended data dependencies, as in the spectral transform. The domain decomposition is a key operation in the development of parallel codes.

mpp_domains_mod provides a domain decomposition and domain update API for rectilinear grids, built on top of the mpp_mod API for message passing. Features of mpp_domains_mod include:

Simple, minimal API, with free access to underlying API for more complicated stuff.

Design toward typical use in climate/weather CFD codes.

[Domains]
It is assumed that domain decomposition will mainly be in 2 horizontal dimensions, which will in general be the two fastest-varying indices. There is a separate implementation of 1D decomposition on the fastest-varying index, and 1D decomposition on the second index, treated as a special case of 2D decomposition, is also possible. We define domain as the grid associated with a task. We define the compute domain as the set of gridpoints that are computed by a task, and the data domain as the set of points that are required by the task for the calculation. There can in general be more than 1 task per PE, though often the number of domains is the same as the processor count. We define the global domain as the global computational domain of the entire model (i.e., the same as the computational domain if run on a single processor). 2D domains are defined using a derived type domain2D, constructed as follows (see comments in code for more details).
 type, public :: domain_axis_spec
   private
   integer :: begin, end, size, max_size
   logical :: is_global
 end type domain_axis_spec

 type, public :: domain1D
   private
   type(domain_axis_spec) :: compute, data, global, active
   logical :: mustputb, mustgetb, mustputf, mustgetf, folded
   type(domain1D), pointer, dimension(:) :: list
   integer :: pe  ! pe to which the domain is assigned
   integer :: pos
 end type domain1D

 type, public :: domain2D
   private
   type(domain1D) :: x
   type(domain1D) :: y
   type(domain2D), pointer, dimension(:) :: list
   integer :: pe ! PE to which this domain is assigned
   integer :: pos
 end type domain2D

 type(domain1D), public :: NULL_DOMAIN1D
 type(domain2D), public :: NULL_DOMAIN2D
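
A minimal end-to-end sketch of the typical call sequence (the grid size, halo widths and field contents are assumptions for illustration, not values prescribed by this module):

 program decomp_example
   use mpp_mod,         only : mpp_init, mpp_exit, mpp_npes, mpp_pe
   use mpp_domains_mod, only : domain2D, mpp_domains_init, mpp_domains_exit, &
                               mpp_define_layout, mpp_define_domains,        &
                               mpp_get_compute_domain, mpp_get_data_domain,  &
                               mpp_update_domains
   implicit none
   integer, parameter :: ni = 96, nj = 96     ! assumed global grid size
   type(domain2D)     :: domain
   integer            :: layout(2)
   integer            :: is, ie, js, je, isd, ied, jsd, jed
   real, allocatable  :: field(:,:)

   call mpp_init()
   call mpp_domains_init()
   call mpp_define_layout( (/1,ni,1,nj/), mpp_npes(), layout )
   call mpp_define_domains( (/1,ni,1,nj/), layout, domain, xhalo=2, yhalo=2 )
   call mpp_get_compute_domain( domain, is, ie, js, je )
   call mpp_get_data_domain( domain, isd, ied, jsd, jed )
   allocate( field(isd:ied,jsd:jed) )
   field = real( mpp_pe() )                   ! assumed fill; halo values are overwritten next
   call mpp_update_domains( field, domain )   ! fill halo points from neighboring PEs
   call mpp_domains_exit()
   call mpp_exit()
 end program decomp_example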

Data Type Documentation

◆ mpp_domains_mod::check_data_size

interface mpp_domains_mod::check_data_size

Private interface for internal usage, compares two sizes.

Definition at line 2335 of file mpp_domains.F90.

Public Member Functions

 check_data_size_1d
 
 check_data_size_2d
 

◆ mpp_domains_mod::contact_type

type mpp_domains_mod::contact_type

Type used to represent the contact between tiles.

Note
This type will only be used in mpp_domains_define.inc

Definition at line 417 of file mpp_domains.F90.

Collaboration diagram for contact_type:

Public Attributes

integer, dimension(:), pointer align1 =>NULL()
 
integer, dimension(:), pointer align2 =>NULL()
 alignment of me and neighbor
 
integer, dimension(:), pointer ie1 =>NULL()
 i-index of current tile representing contact
 
integer, dimension(:), pointer ie2 =>NULL()
 i-index of neighbor tile representing contact
 
integer, dimension(:), pointer is1 =>NULL()
 
integer, dimension(:), pointer is2 =>NULL()
 
integer, dimension(:), pointer je1 =>NULL()
 j-index of current tile representing contact
 
integer, dimension(:), pointer je2 =>NULL()
 j-index of neighbor tile representing contact
 
integer, dimension(:), pointer js1 =>NULL()
 
integer, dimension(:), pointer js2 =>NULL()
 
real, dimension(:), pointer refine1 =>NULL()
 
real, dimension(:), pointer refine2 =>NULL()
 
integer, dimension(:), pointer tile =>NULL()
 neighbor tile
 

Private Attributes

integer ncontact
 number of neighbor tiles.
 

Member Data Documentation

◆ align1

integer, dimension(:), pointer align1 =>NULL()

Definition at line 421 of file mpp_domains.F90.

◆ align2

integer, dimension(:), pointer align2 =>NULL()

alignment of me and neighbor

Definition at line 421 of file mpp_domains.F90.

◆ ie1

integer, dimension(:), pointer ie1 =>NULL()

i-index of current tile representing contact

Definition at line 423 of file mpp_domains.F90.

◆ ie2

integer, dimension(:), pointer ie2 =>NULL()

i-index of neighbor tile representing contact

Definition at line 425 of file mpp_domains.F90.

◆ is1

integer, dimension(:), pointer is1 =>NULL()

Definition at line 423 of file mpp_domains.F90.

◆ is2

integer, dimension(:), pointer is2 =>NULL()

Definition at line 425 of file mpp_domains.F90.

◆ je1

integer, dimension(:), pointer je1 =>NULL()

j-index of current tile representing contact

Definition at line 424 of file mpp_domains.F90.

◆ je2

integer, dimension(:), pointer je2 =>NULL()

j-index of neighbor tile representing contact

Definition at line 426 of file mpp_domains.F90.

◆ js1

integer, dimension(:), pointer js1 =>NULL()

Definition at line 424 of file mpp_domains.F90.

◆ js2

integer, dimension(:), pointer js2 =>NULL()

Definition at line 426 of file mpp_domains.F90.

◆ ncontact

integer ncontact
private

number of neighbor tiles.

Definition at line 419 of file mpp_domains.F90.

◆ refine1

real, dimension(:), pointer refine1 =>NULL()

Definition at line 422 of file mpp_domains.F90.

◆ refine2

real, dimension(:), pointer refine2 =>NULL()

Definition at line 422 of file mpp_domains.F90.

◆ tile

integer, dimension(:), pointer tile =>NULL()

neighbor tile

Definition at line 420 of file mpp_domains.F90.

◆ mpp_domains_mod::domain1d

type mpp_domains_mod::domain1d

One dimensional domain used to manage shared data access between pes.

Definition at line 631 of file mpp_domains.F90.

Collaboration diagram for domain1d:

Public Attributes

logical cyclic
 true if domain is cyclic
 
type(domain_axis_spec) domain_data
 index limits for data domain
 
type(domain_axis_spec) global
 index limits for global domain
 
integer goffset
 needed for global sum
 
type(domain1d), dimension(:), pointer list =>NULL()
 list of each pe's domains
 
integer loffset
 needed for global sum
 
type(domain_axis_spec) memory
 index limits for memory domain
 
integer pe
 PE to which this domain is assigned.
 
integer pos
 position of this PE within link list, i.e., domain%list(pos)%pe = pe
 

Private Attributes

type(domain_axis_spec) compute
 index limits for compute domain
 

Member Data Documentation

◆ compute

type(domain_axis_spec) compute
private

index limits for compute domain

Definition at line 633 of file mpp_domains.F90.

◆ cyclic

logical cyclic

true if domain is cyclic

Definition at line 637 of file mpp_domains.F90.

◆ domain_data

type(domain_axis_spec) domain_data

index limits for data domain

Definition at line 634 of file mpp_domains.F90.

◆ global

type(domain_axis_spec) global

index limits for global domain

Definition at line 635 of file mpp_domains.F90.

◆ goffset

integer goffset

needed for global sum

Definition at line 641 of file mpp_domains.F90.

◆ list

type(domain1d), dimension(:), pointer list =>NULL()

list of each pe's domains

Definition at line 638 of file mpp_domains.F90.

◆ loffset

integer loffset

needed for global sum

Definition at line 642 of file mpp_domains.F90.

◆ memory

type(domain_axis_spec) memory

index limits for memory domain

Definition at line 636 of file mpp_domains.F90.

◆ pe

integer pe

PE to which this domain is assigned.

Definition at line 639 of file mpp_domains.F90.

◆ pos

integer pos

position of this PE within link list, i.e., domain%list(pos)%pe = pe

Definition at line 640 of file mpp_domains.F90.

◆ mpp_domains_mod::domain1d_spec

type mpp_domains_mod::domain1d_spec

A private type used to specify index limits for a domain decomposition.

Definition at line 298 of file mpp_domains.F90.

Collaboration diagram for domain1d_spec:

Public Attributes

type(domain_axis_spec) global
 
integer pos
 

Private Attributes

type(domain_axis_spec) compute
 

Member Data Documentation

◆ compute

type(domain_axis_spec) compute
private

Definition at line 300 of file mpp_domains.F90.

◆ global

type(domain_axis_spec) global

Definition at line 301 of file mpp_domains.F90.

◆ pos

integer pos

Definition at line 302 of file mpp_domains.F90.

◆ mpp_domains_mod::domain2d

type mpp_domains_mod::domain2d

The domain2D type contains all the necessary information to define the global, compute and data domains of each task, as well as the PE associated with the task. The PEs from which remote data may be acquired to update the data domain are also contained in a linked list of neighbours.

Domain types of higher rank can be constructed from type domain1D. Typically we only need 1D and 2D, but could need higher (e.g., 3D LES). Some elements are repeated below if they are needed once per domain, not once per axis.

Definition at line 367 of file mpp_domains.F90.

Collaboration diagram for domain2d:

Public Attributes

type(overlapspec), pointer bound_c => NULL()
 send information for getting boundary value for symmetry domain.
 
type(overlapspec), pointer bound_e => NULL()
 send information for getting boundary value for symmetry domain.
 
type(overlapspec), pointer bound_n => NULL()
 send information for getting boundary value for symmetry domain.
 
type(overlapspec), pointer check_c => NULL()
 send and recv information for boundary consistency check of C-cell
 
type(overlapspec), pointer check_e => NULL()
 send and recv information for boundary consistency check of E-cell
 
type(overlapspec), pointer check_n => NULL()
 send and recv information for boundary consistency check of N-cell
 
integer comm_id
 MPI communicator for the mosaic.
 
integer ehalo
 halo size in x-direction
 
integer fold
 
integer(i8_kind) id
 
logical initialized =.FALSE.
 indicate if the overlapping is computed or not.
 
type(domain2d), pointer io_domain => NULL()
 domain for IO, will be set through calling mpp_set_io_domain ( this will be changed).
 
integer, dimension(2) io_layout
 io_layout, will be set through mpp_define_io_domain; default is the domain layout
 
type(domain2d_spec), dimension(:), pointer list => NULL()
 domain decomposition on pe list
 
integer max_ntile_pe
 maximum value in the pelist of number of tiles on each pe.
 
integer ncontacts
 number of contact region within mosaic.
 
integer nhalo
 halo size in y-direction
 
integer ntiles
 number of tiles within mosaic
 
integer pe
 PE to which this domain is assigned.
 
integer, dimension(:,:), pointer pearray => NULL()
 pe of each layout position
 
integer pos
 position of this PE within link list
 
logical rotated_ninety
 indicate if any contact rotates NINETY or MINUS_NINETY
 
integer shalo
 
logical symmetry
 indicate the domain is symmetric or non-symmetric.
 
integer tile_comm_id
 MPI communicator for this tile of domain.
 
integer, dimension(:), pointer tile_id => NULL()
 tile id of each tile on current processor
 
integer, dimension(:), pointer tile_id_all => NULL()
 tile id of all the tiles of domain
 
integer tile_root_pe
 root pe of current tile.
 
type(tile_type), dimension(:), pointer tilelist => NULL()
 store tile information
 
type(overlapspec), pointer update_c => NULL()
 send and recv information for halo update of C-cell.
 
type(overlapspec), pointer update_e => NULL()
 send and recv information for halo update of E-cell.
 
type(overlapspec), pointer update_n => NULL()
 send and recv information for halo update of N-cell.
 
type(overlapspec), pointer update_t => NULL()
 send and recv information for halo update of T-cell.
 
integer whalo
 
type(domain1d), dimension(:), pointer x => NULL()
 x-direction domain decomposition
 
type(domain1d), dimension(:), pointer y => NULL()
 y-direction domain decomposition
 

Private Attributes

character(len=name_length) name ='unnamed'
 name of the domain, default is "unnamed"
 

Member Data Documentation

◆ bound_c

type(overlapspec), pointer bound_c => NULL()

send information for getting boundary value for symmetry domain.

Definition at line 400 of file mpp_domains.F90.

◆ bound_e

type(overlapspec), pointer bound_e => NULL()

send information for getting boundary value for symmetry domain.

Definition at line 402 of file mpp_domains.F90.

◆ bound_n

type(overlapspec), pointer bound_n => NULL()

send information for getting boundary value for symmetry domain.

Definition at line 404 of file mpp_domains.F90.

◆ check_c

type(overlapspec), pointer check_c => NULL()

send and recv information for boundary consistency check of C-cell

Definition at line 394 of file mpp_domains.F90.

◆ check_e

type(overlapspec), pointer check_e => NULL()

send and recv information for boundary consistency check of E-cell

Definition at line 396 of file mpp_domains.F90.

◆ check_n

type(overlapspec), pointer check_n => NULL()

send and recv information for boundary consistency check of N-cell

Definition at line 398 of file mpp_domains.F90.

◆ comm_id

integer comm_id

MPI communicator for the mosaic.

Definition at line 378 of file mpp_domains.F90.

◆ ehalo

integer ehalo

halo size in x-direction

Definition at line 375 of file mpp_domains.F90.

◆ fold

integer fold

Definition at line 372 of file mpp_domains.F90.

◆ id

integer(i8_kind) id

Definition at line 370 of file mpp_domains.F90.

◆ initialized

logical initialized =.FALSE.

indicate if the overlapping is computed or not.

Definition at line 383 of file mpp_domains.F90.

◆ io_domain

type(domain2d), pointer io_domain => NULL()

domain for IO, will be set through calling mpp_set_io_domain ( this will be changed).

Definition at line 410 of file mpp_domains.F90.

◆ io_layout

integer, dimension(2) io_layout

io_layout, will be set through mpp_define_io_domain; default is the domain layout

Definition at line 385 of file mpp_domains.F90.

◆ list

type(domain2d_spec), dimension(:), pointer list => NULL()

domain decomposition on pe list

Definition at line 392 of file mpp_domains.F90.

◆ max_ntile_pe

integer max_ntile_pe

maximum value in the pelist of number of tiles on each pe.

Definition at line 380 of file mpp_domains.F90.

◆ name

character(len=name_length) name ='unnamed'
private

name of the domain, default is "unnamed"

Definition at line 369 of file mpp_domains.F90.

◆ ncontacts

integer ncontacts

number of contact region within mosaic.

Definition at line 381 of file mpp_domains.F90.

◆ nhalo

integer nhalo

halo size in y-direction

Definition at line 376 of file mpp_domains.F90.

◆ ntiles

integer ntiles

number of tiles within mosaic

Definition at line 377 of file mpp_domains.F90.

◆ pe

integer pe

PE to which this domain is assigned.

Definition at line 371 of file mpp_domains.F90.

◆ pearray

integer, dimension(:,:), pointer pearray => NULL()

pe of each layout position

Definition at line 387 of file mpp_domains.F90.

◆ pos

integer pos

position of this PE within link list

Definition at line 373 of file mpp_domains.F90.

◆ rotated_ninety

logical rotated_ninety

indicate if any contact rotates NINETY or MINUS_NINETY

Definition at line 382 of file mpp_domains.F90.

◆ shalo

integer shalo

Definition at line 376 of file mpp_domains.F90.

◆ symmetry

logical symmetry

indicate the domain is symmetric or non-symmetric.

Definition at line 374 of file mpp_domains.F90.

◆ tile_comm_id

integer tile_comm_id

MPI communicator for this tile of domain.

Definition at line 379 of file mpp_domains.F90.

◆ tile_id

integer, dimension(:), pointer tile_id => NULL()

tile id of each tile on current processor

Definition at line 388 of file mpp_domains.F90.

◆ tile_id_all

integer, dimension(:), pointer tile_id_all => NULL()

tile id of all the tiles of domain

Definition at line 389 of file mpp_domains.F90.

◆ tile_root_pe

integer tile_root_pe

root pe of current tile.

Definition at line 384 of file mpp_domains.F90.

◆ tilelist

type(tile_type), dimension(:), pointer tilelist => NULL()

store tile information

Definition at line 393 of file mpp_domains.F90.

◆ update_c

type(overlapspec), pointer update_c => NULL()

send and recv information for halo update of C-cell.

Definition at line 408 of file mpp_domains.F90.

◆ update_e

type(overlapspec), pointer update_e => NULL()

send and recv information for halo update of E-cell.

Definition at line 407 of file mpp_domains.F90.

◆ update_n

type(overlapspec), pointer update_n => NULL()

send and recv information for halo update of N-cell.

Definition at line 409 of file mpp_domains.F90.

◆ update_t

type(overlapspec), pointer update_t => NULL()

send and recv information for halo update of T-cell.

Definition at line 406 of file mpp_domains.F90.

◆ whalo

integer whalo

Definition at line 375 of file mpp_domains.F90.

◆ x

type(domain1d), dimension(:), pointer x => NULL()

x-direction domain decomposition

Definition at line 390 of file mpp_domains.F90.

◆ y

type(domain1d), dimension(:), pointer y => NULL()

y-direction domain decomposition

Definition at line 391 of file mpp_domains.F90.

◆ mpp_domains_mod::domain2d_spec

type mpp_domains_mod::domain2d_spec

Private type to specify multiple index limits and pe information for a 2D domain.

Definition at line 307 of file mpp_domains.F90.

Collaboration diagram for domain2d_spec:

Public Attributes

integer pe
 PE to which this domain is assigned.
 
integer pos
 position of this PE within link list
 
integer, dimension(:), pointer tile_id => NULL()
 tile id of each tile
 
integer tile_root_pe
 root pe of tile.
 
type(domain1d_spec), dimension(:), pointer y => NULL()
 y-direction domain decomposition
 

Private Attributes

type(domain1d_spec), dimension(:), pointer x => NULL()
 x-direction domain decomposition
 

Member Data Documentation

◆ pe

integer pe

PE to which this domain is assigned.

Definition at line 312 of file mpp_domains.F90.

◆ pos

integer pos

position of this PE within link list

Definition at line 313 of file mpp_domains.F90.

◆ tile_id

integer, dimension(:), pointer tile_id => NULL()

tile id of each tile

Definition at line 311 of file mpp_domains.F90.

◆ tile_root_pe

integer tile_root_pe

root pe of tile.

Definition at line 314 of file mpp_domains.F90.

◆ x

type(domain1d_spec), dimension(:), pointer x => NULL()
private

x-direction domain decomposition

Definition at line 309 of file mpp_domains.F90.

◆ y

type(domain1d_spec), dimension(:), pointer y => NULL()

y-direction domain decomposition

Definition at line 310 of file mpp_domains.F90.

◆ mpp_domains_mod::domain_axis_spec

type mpp_domains_mod::domain_axis_spec

Used to specify index limits along an axis of a domain.

Definition at line 287 of file mpp_domains.F90.

Collaboration diagram for domain_axis_spec:

Public Attributes

integer end
 end of domain axis
 
logical is_global
 .true. if domain axis extent covers global domain
 
integer max_size
 max size in set
 
integer size
 size of domain axis
 

Private Attributes

integer begin
 start of domain axis
 

Member Data Documentation

◆ begin

integer begin
private

start of domain axis

Definition at line 289 of file mpp_domains.F90.

◆ end

integer end

end of domain axis

Definition at line 290 of file mpp_domains.F90.

◆ is_global

logical is_global

.true. if domain axis extent covers global domain

Definition at line 293 of file mpp_domains.F90.

◆ max_size

integer max_size

max size in set

Definition at line 292 of file mpp_domains.F90.

◆ size

integer size

size of domain axis

Definition at line 291 of file mpp_domains.F90.

◆ mpp_domains_mod::domaincommunicator2d

type mpp_domains_mod::domaincommunicator2d

Used for sending domain data between pe's.

Definition at line 498 of file mpp_domains.F90.

Collaboration diagram for domaincommunicator2d:

Public Attributes

integer, dimension(:), allocatable cfrom_pe
 
integer, dimension(:), allocatable cto_pe
 
type(domain2d), pointer domain =>NULL()
 
type(domain2d), pointer domain_in =>NULL()
 
type(domain2d), pointer domain_out =>NULL()
 
integer gf_ioff =0
 
integer gf_joff =0
 
integer(i8_kind) id =-9999
 
integer isize =0
 
integer isize_in =0
 
integer isize_max =0
 
integer isize_out =0
 
integer, dimension(:), allocatable isizer
 
integer jsize =0
 
integer jsize_in =0
 
integer jsize_max =0
 
integer jsize_out =0
 
integer, dimension(:), allocatable jsizer
 
integer ke =0
 
integer(i8_kind) l_addr =-9999
 
integer(i8_kind) l_addrx =-9999
 
integer(i8_kind) l_addry =-9999
 
integer position
 data location. T, E, C, or N.
 
logical, dimension(:), allocatable r_do_buf
 
integer, dimension(:), allocatable r_msize
 
type(overlapspec), dimension(:,:,:,:), pointer recv => NULL()
 
integer, dimension(:,:), allocatable recvie
 
integer, dimension(:,:), allocatable recvis
 
integer, dimension(:,:), allocatable recvje
 
integer, dimension(:,:), allocatable recvjs
 
integer(i8_kind), dimension(:), allocatable rem_addr
 
integer(i8_kind), dimension(:,:), allocatable rem_addrl
 
integer(i8_kind), dimension(:,:), allocatable rem_addrlx
 
integer(i8_kind), dimension(:,:), allocatable rem_addrly
 
integer(i8_kind), dimension(:), allocatable rem_addrx
 
integer(i8_kind), dimension(:), allocatable rem_addry
 
integer rlist_size =0
 
logical, dimension(:), allocatable s_do_buf
 
integer, dimension(:), allocatable s_msize
 
type(overlapspec), dimension(:,:,:,:), pointer send => NULL()
 
integer, dimension(:,:), allocatable sendie
 
integer, dimension(:,:), allocatable sendis
 
integer, dimension(:,:), allocatable sendisr
 
integer, dimension(:,:), allocatable sendje
 
integer, dimension(:,:), allocatable sendjs
 
integer, dimension(:,:), allocatable sendjsr
 
integer slist_size =0
 

Private Attributes

logical initialized =.false.
 

Member Data Documentation

◆ cfrom_pe

integer, dimension(:), allocatable cfrom_pe

Definition at line 521 of file mpp_domains.F90.

◆ cto_pe

integer, dimension(:), allocatable cto_pe

Definition at line 520 of file mpp_domains.F90.

◆ domain

type(domain2d), pointer domain =>NULL()

Definition at line 505 of file mpp_domains.F90.

◆ domain_in

type(domain2d), pointer domain_in =>NULL()

Definition at line 506 of file mpp_domains.F90.

◆ domain_out

type(domain2d), pointer domain_out =>NULL()

Definition at line 507 of file mpp_domains.F90.

◆ gf_ioff

integer gf_ioff =0

Definition at line 529 of file mpp_domains.F90.

◆ gf_joff

integer gf_joff =0

Definition at line 529 of file mpp_domains.F90.

◆ id

integer(i8_kind) id =-9999

Definition at line 501 of file mpp_domains.F90.

◆ initialized

logical initialized =.false.
private

Definition at line 500 of file mpp_domains.F90.

◆ isize

integer isize =0

Definition at line 525 of file mpp_domains.F90.

◆ isize_in

integer isize_in =0

Definition at line 526 of file mpp_domains.F90.

◆ isize_max

integer isize_max =0

Definition at line 528 of file mpp_domains.F90.

◆ isize_out

integer isize_out =0

Definition at line 527 of file mpp_domains.F90.

◆ isizer

integer, dimension(:), allocatable isizer

Definition at line 531 of file mpp_domains.F90.

◆ jsize

integer jsize =0

Definition at line 525 of file mpp_domains.F90.

◆ jsize_in

integer jsize_in =0

Definition at line 526 of file mpp_domains.F90.

◆ jsize_max

integer jsize_max =0

Definition at line 528 of file mpp_domains.F90.

◆ jsize_out

integer jsize_out =0

Definition at line 527 of file mpp_domains.F90.

◆ jsizer

integer, dimension(:), allocatable jsizer

Definition at line 532 of file mpp_domains.F90.

◆ ke

integer ke =0

Definition at line 525 of file mpp_domains.F90.

◆ l_addr

integer(i8_kind) l_addr =-9999

Definition at line 502 of file mpp_domains.F90.

◆ l_addrx

integer(i8_kind) l_addrx =-9999

Definition at line 503 of file mpp_domains.F90.

◆ l_addry

integer(i8_kind) l_addry =-9999

Definition at line 504 of file mpp_domains.F90.

◆ position

integer position

data location. T, E, C, or N.

Definition at line 541 of file mpp_domains.F90.

◆ r_do_buf

logical, dimension(:), allocatable r_do_buf

Definition at line 519 of file mpp_domains.F90.

◆ r_msize

integer, dimension(:), allocatable r_msize

Definition at line 523 of file mpp_domains.F90.

◆ recv

type(overlapspec), dimension(:,:,:,:), pointer recv => NULL()

Definition at line 509 of file mpp_domains.F90.

◆ recvie

integer, dimension(:,:), allocatable recvie

Definition at line 515 of file mpp_domains.F90.

◆ recvis

integer, dimension(:,:), allocatable recvis

Definition at line 514 of file mpp_domains.F90.

◆ recvje

integer, dimension(:,:), allocatable recvje

Definition at line 517 of file mpp_domains.F90.

◆ recvjs

integer, dimension(:,:), allocatable recvjs

Definition at line 516 of file mpp_domains.F90.

◆ rem_addr

integer(i8_kind), dimension(:), allocatable rem_addr

Definition at line 535 of file mpp_domains.F90.

◆ rem_addrl

integer(i8_kind), dimension(:,:), allocatable rem_addrl

Definition at line 538 of file mpp_domains.F90.

◆ rem_addrlx

integer(i8_kind), dimension(:,:), allocatable rem_addrlx

Definition at line 539 of file mpp_domains.F90.

◆ rem_addrly

integer(i8_kind), dimension(:,:), allocatable rem_addrly

Definition at line 540 of file mpp_domains.F90.

◆ rem_addrx

integer(i8_kind), dimension(:), allocatable rem_addrx

Definition at line 536 of file mpp_domains.F90.

◆ rem_addry

integer(i8_kind), dimension(:), allocatable rem_addry

Definition at line 537 of file mpp_domains.F90.

◆ rlist_size

integer rlist_size =0

Definition at line 524 of file mpp_domains.F90.

◆ s_do_buf

logical, dimension(:), allocatable s_do_buf

Definition at line 518 of file mpp_domains.F90.

◆ s_msize

integer, dimension(:), allocatable s_msize

Definition at line 522 of file mpp_domains.F90.

◆ send

type(overlapspec), dimension(:,:,:,:), pointer send => NULL()

Definition at line 508 of file mpp_domains.F90.

◆ sendie

integer, dimension(:,:), allocatable sendie

Definition at line 511 of file mpp_domains.F90.

◆ sendis

integer, dimension(:,:), allocatable sendis

Definition at line 510 of file mpp_domains.F90.

◆ sendisr

integer, dimension(:,:), allocatable sendisr

Definition at line 533 of file mpp_domains.F90.

◆ sendje

integer, dimension(:,:), allocatable sendje

Definition at line 513 of file mpp_domains.F90.

◆ sendjs

integer, dimension(:,:), allocatable sendjs

Definition at line 512 of file mpp_domains.F90.

◆ sendjsr

integer, dimension(:,:), allocatable sendjsr

Definition at line 534 of file mpp_domains.F90.

◆ slist_size

integer slist_size =0

Definition at line 524 of file mpp_domains.F90.

◆ mpp_domains_mod::domainug

type mpp_domains_mod::domainug

Domain information for managing data on unstructured grids.

Definition at line 266 of file mpp_domains.F90.

Collaboration diagram for domainug:

Public Attributes

type(unstruct_axis_spec) global
 axis specifications
 
integer, dimension(:), pointer grid_index => NULL()
 index of grid on current pe
 
type(domainug), pointer io_domain =>NULL()
 
integer(i4_kind) io_layout
 
type(unstruct_domain_spec), dimension(:), pointer list =>NULL()
 
integer npes_io_group
 
integer ntiles
 
integer pe
 
integer pos
 
type(unstruct_pass_type) sg2ug
 
type(domain2d), pointer sg_domain => NULL()
 
integer tile_id
 
integer tile_npes
 
integer tile_root_pe
 
type(unstruct_pass_type) ug2sg
 

Private Attributes

type(unstruct_axis_spec) compute
 

Member Data Documentation

◆ compute

type(unstruct_axis_spec) compute
private

Definition at line 268 of file mpp_domains.F90.

◆ global

type(unstruct_axis_spec) global

axis specifications

Definition at line 268 of file mpp_domains.F90.

◆ grid_index

integer, dimension(:), pointer grid_index => NULL()

index of grid on current pe

Definition at line 273 of file mpp_domains.F90.

◆ io_domain

type(domainug), pointer io_domain =>NULL()

Definition at line 270 of file mpp_domains.F90.

◆ io_layout

integer(i4_kind) io_layout

Definition at line 282 of file mpp_domains.F90.

◆ list

type(unstruct_domain_spec), dimension(:), pointer list =>NULL()

Definition at line 269 of file mpp_domains.F90.

◆ npes_io_group

integer npes_io_group

Definition at line 281 of file mpp_domains.F90.

◆ ntiles

integer ntiles

Definition at line 277 of file mpp_domains.F90.

◆ pe

integer pe

Definition at line 275 of file mpp_domains.F90.

◆ pos

integer pos

Definition at line 276 of file mpp_domains.F90.

◆ sg2ug

type(unstruct_pass_type) sg2ug

Definition at line 271 of file mpp_domains.F90.

◆ sg_domain

type(domain2d), pointer sg_domain => NULL()

Definition at line 274 of file mpp_domains.F90.

◆ tile_id

integer tile_id

Definition at line 278 of file mpp_domains.F90.

◆ tile_npes

integer tile_npes

Definition at line 280 of file mpp_domains.F90.

◆ tile_root_pe

integer tile_root_pe

Definition at line 279 of file mpp_domains.F90.

◆ ug2sg

type(unstruct_pass_type) ug2sg

Definition at line 272 of file mpp_domains.F90.

◆ mpp_domains_mod::index_type

type mpp_domains_mod::index_type

index bounds for use in nestSpec

Definition at line 431 of file mpp_domains.F90.

Collaboration diagram for index_type:

Public Attributes

integer ie_me
 
integer ie_you
 
integer is_me
 
integer is_you
 
integer je_me
 
integer je_you
 
integer js_me
 
integer js_you
 

Member Data Documentation

◆ ie_me

integer ie_me

Definition at line 432 of file mpp_domains.F90.

◆ ie_you

integer ie_you

Definition at line 433 of file mpp_domains.F90.

◆ is_me

integer is_me

Definition at line 432 of file mpp_domains.F90.

◆ is_you

integer is_you

Definition at line 433 of file mpp_domains.F90.

◆ je_me

integer je_me

Definition at line 432 of file mpp_domains.F90.

◆ je_you

integer je_you

Definition at line 433 of file mpp_domains.F90.

◆ js_me

integer js_me

Definition at line 432 of file mpp_domains.F90.

◆ js_you

integer js_you

Definition at line 433 of file mpp_domains.F90.

◆ mpp_domains_mod::mpp_broadcast_domain

interface mpp_domains_mod::mpp_broadcast_domain

Broadcasts domain to every pe. Only useful outside the context of its own pelist.


Example usage:

       call mpp_broadcast_domain(domain)
       call mpp_broadcast_domain(domain_in, domain_out)
       call mpp_broadcast_domain(domain, tile_coarse) ! nested domains

Definition at line 1505 of file mpp_domains.F90.

Public Member Functions

 mpp_broadcast_domain_1
 
 mpp_broadcast_domain_2
 
 mpp_broadcast_domain_nest_coarse
 
 mpp_broadcast_domain_nest_fine
 
 mpp_broadcast_domain_ug
 

◆ mpp_domains_mod::mpp_check_field

interface mpp_domains_mod::mpp_check_field

Parallel checking between two ensembles which run on different sets of pes at the same time.
There are two forms for the mpp_check_field call. The 2D version is generally to be used and the 3D version is built by repeated calls to the 2D version.

Example usage:

call mpp_check_field(field_in, pelist1, pelist2, domain, mesg, &
w_halo, s_halo, e_halo, n_halo, force_abort )
Parameters
field_in       Field to be checked
domain         Domain of current pe
mesg           Message to be printed out
w_halo         Halo size to be checked, default is 0
s_halo         Halo size to be checked, default is 0
e_halo         Halo size to be checked, default is 0
n_halo         Halo size to be checked, default is 0
force_abort    When true, abort the program when any difference is found. Default is false.

Definition at line 1751 of file mpp_domains.F90.

Public Member Functions

 mpp_check_field_2d
 
 mpp_check_field_3d
 

◆ mpp_domains_mod::mpp_complete_do_update

interface mpp_domains_mod::mpp_complete_do_update

Private interface used for non-blocking updates.

Definition at line 1294 of file mpp_domains.F90.

Public Member Functions

 mpp_complete_do_update_i4_3d
 
 mpp_complete_do_update_i8_3d
 
 mpp_complete_do_update_r4_3d
 
 mpp_complete_do_update_r4_3dv
 
 mpp_complete_do_update_r8_3d
 
 mpp_complete_do_update_r8_3dv
 

◆ mpp_domains_mod::mpp_complete_group_update

interface mpp_domains_mod::mpp_complete_group_update

Completes a pending non-blocking group update. Must follow a call to mpp_start_group_update.

Parameters
[in,out]  group    of type(mpp_group_update_type)
[in,out]  domain   of type(domain2D)
[in]      d_type   data type
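
Example usage (a minimal sketch of the create/start/complete sequence; the field names u and v and the exact argument lists shown are illustrative assumptions rather than a verified interface):

       type(mpp_group_update_type) :: group
       real :: d_type                  ! dummy scalar of the fields' type/kind, used for dispatch
       ! build the group once, registering each field to be updated together
       call mpp_create_group_update(group, u, domain)
       call mpp_create_group_update(group, v, domain)
       ! post the non-blocking exchange, overlap local work, then complete it
       call mpp_start_group_update(group, domain, d_type)
       ! ... computation that does not touch the halo points ...
       call mpp_complete_group_update(group, domain, d_type)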

Definition at line 1354 of file mpp_domains.F90.

Public Member Functions

 mpp_complete_group_update_r4
 
 mpp_complete_group_update_r8
 

◆ mpp_domains_mod::mpp_complete_update_domains

interface mpp_domains_mod::mpp_complete_update_domains

Must be used after a call to mpp_start_update_domains in order to complete a nonblocking domain update. See mpp_start_update_domains for more info.
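
Example usage (a minimal sketch of the start/complete pairing; the integer handle returned by mpp_start_update_domains and the field name are illustrative assumptions):

       integer :: id_update
       ! post the non-blocking halo update of field a (allocated on the data domain)
       id_update = mpp_start_update_domains(a, domain)
       ! ... computation that does not read the halo points ...
       ! unpack the halos and wait for all communication to complete
       call mpp_complete_update_domains(id_update, a, domain)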

Definition at line 1236 of file mpp_domains.F90.

Public Member Functions

 mpp_complete_update_domain2d_i4_2d
 
 mpp_complete_update_domain2d_i4_3d
 
 mpp_complete_update_domain2d_i4_4d
 
 mpp_complete_update_domain2d_i4_5d
 
 mpp_complete_update_domain2d_i8_2d
 
 mpp_complete_update_domain2d_i8_3d
 
 mpp_complete_update_domain2d_i8_4d
 
 mpp_complete_update_domain2d_i8_5d
 
 mpp_complete_update_domain2d_r4_2d
 
 mpp_complete_update_domain2d_r4_2dv
 
 mpp_complete_update_domain2d_r4_3d
 
 mpp_complete_update_domain2d_r4_3dv
 
 mpp_complete_update_domain2d_r4_4d
 
 mpp_complete_update_domain2d_r4_4dv
 
 mpp_complete_update_domain2d_r4_5d
 
 mpp_complete_update_domain2d_r4_5dv
 
 mpp_complete_update_domain2d_r8_2d
 
 mpp_complete_update_domain2d_r8_2dv
 
 mpp_complete_update_domain2d_r8_3d
 
 mpp_complete_update_domain2d_r8_3dv
 
 mpp_complete_update_domain2d_r8_4d
 
 mpp_complete_update_domain2d_r8_4dv
 
 mpp_complete_update_domain2d_r8_5d
 
 mpp_complete_update_domain2d_r8_5dv
 

◆ mpp_domains_mod::mpp_copy_domain

interface mpp_domains_mod::mpp_copy_domain

Copy 1D or 2D domain.

Parameters
domain_in     Input domain to be read from
domain_out    Output domain to be written to
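
Example usage (argument names follow the parameters above):

       call mpp_copy_domain(domain_in, domain_out)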

Definition at line 912 of file mpp_domains.F90.

Public Member Functions

 mpp_copy_domain1d
 
 mpp_copy_domain2d
 

◆ mpp_domains_mod::mpp_create_group_update

interface mpp_domains_mod::mpp_create_group_update

Constructor for the mpp_group_update_type which is then used with mpp_start_group_update.

Definition at line 1314 of file mpp_domains.F90.

Public Member Functions

 mpp_create_group_update_r4_2d
 
 mpp_create_group_update_r4_2dv
 
 mpp_create_group_update_r4_3d
 
 mpp_create_group_update_r4_3dv
 
 mpp_create_group_update_r4_4d
 
 mpp_create_group_update_r4_4dv
 
 mpp_create_group_update_r8_2d
 
 mpp_create_group_update_r8_2dv
 
 mpp_create_group_update_r8_3d
 
 mpp_create_group_update_r8_3dv
 
 mpp_create_group_update_r8_4d
 
 mpp_create_group_update_r8_4dv
 

◆ mpp_domains_mod::mpp_deallocate_domain

interface mpp_domains_mod::mpp_deallocate_domain

Deallocate given 1D or 2D domain.

Parameters
domain    an allocated domain1D or domain2D
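
Example usage (domain is the allocated domain1D or domain2D described above):

       call mpp_deallocate_domain(domain)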

Definition at line 919 of file mpp_domains.F90.

Public Member Functions

 mpp_deallocate_domain1d
 
 mpp_deallocate_domain2d
 

◆ mpp_domains_mod::mpp_define_domains

interface mpp_domains_mod::mpp_define_domains

Set up a domain decomposition.

There are two forms for the mpp_define_domains call. The 2D version is generally to be used but is built by repeated calls to the 1D version, also provided.


Example usage:

               call mpp_define_domains( global_indices, ndivs, domain, &
                              pelist, flags, halo, extent, maskmap )
               call mpp_define_domains( global_indices, layout, domain, pelist, &
                              xflags, yflags, xhalo, yhalo,           &
                              xextent, yextent, maskmap, name )
Parameters
global_indices    Defines the global domain.
ndivs             The number of domain divisions required.
[in,out] domain   Holds the resulting domain decomposition.
pelist            List of PEs to which the domains are to be assigned.
flags             An optional flag to pass additional information about the desired domain topology. Useful flags in a 1D decomposition include GLOBAL_DATA_DOMAIN and CYCLIC_GLOBAL_DOMAIN. Flags are integers: multiple flags may be added together. The flag values are public parameters available by use association.
halo              Width of the halo.
extent            Normally mpp_define_domains attempts an even division of the global domain across ndivs domains. The extent array can be used by the user to pass a custom domain division. The extent array has ndivs elements and holds the compute domain widths, which should add up to cover the global domain exactly.
maskmap           Some divisions may be masked (maskmap=.FALSE.) to exclude them from the computation (e.g. for ocean model domains that are all land). The maskmap array is dimensioned ndivs and contains .TRUE. values for any domain that must be included in the computation (default all). The pelist array length should match the number of domains included in the computation.


Example usage:

call mpp_define_domains( (/1,100/), 10, domain, &
flags=global_data_domain+cyclic_global_domain, halo=2 )

defines 10 compute domains spanning the range [1,100] of the global domain. The compute domains are non-overlapping blocks of 10. All the data domains are global, and with a halo of 2 span the range [-1:102]. And since the global domain has been declared to be cyclic, domain(9)%next => domain(0) and domain(0)%prev => domain(9). A field is allocated on the data domain, and computations proceed on the compute domain. A call to mpp_update_domains would fill in the values in the halo region:

call mpp_get_data_domain( domain, isd, ied ) !returns -1 and 102
call mpp_get_compute_domain( domain, is, ie ) !returns (1,10) on PE 0 ...
allocate( a(isd:ied) )
do i = is,ie
a(i) = <perform computations>
end do
call mpp_update_domains( a, domain )


The call to mpp_update_domains fills in the regions outside the compute domain. Since the global domain is cyclic, the values at i=(-1,0) are the same as at i=(99,100); and i=(101,102) are the same as i=(1,2).

The 2D version is just an extension of this syntax to two dimensions.

The 2D version of the above should generally be used in codes, including 1D-decomposed ones, if there is a possibility of future evolution toward 2D decomposition. The arguments are similar to the 1D case, except that now we have optional arguments flags, halo, extent and maskmap along two axes.

flags can now take an additional possible value to fold one or more edges. This is done by using flags FOLD_WEST_EDGE, FOLD_EAST_EDGE, FOLD_SOUTH_EDGE or FOLD_NORTH_EDGE. When a fold exists (e.g cylindrical domain), vector fields reverse sign upon crossing the fold. This parity reversal is performed only in the vector version of mpp_update_domains. In addition, shift operations may need to be applied to vector fields on staggered grids, also described in the vector interface to mpp_update_domains.

name is the name associated with the decomposition, e.g 'Ocean model'. If this argument is present, mpp_define_domains will print the domain decomposition generated to stdlog.


Examples:

               call mpp_define_domains( (/1,100,1,100/), (/2,2/), domain, xhalo=1 )

will create the following domain layout:

    domain            domain(1)      domain(2)      domain(3)      domain(4)
    Compute domain    1,50,1,50      51,100,1,50    1,50,51,100    51,100,51,100
    Data domain       0,51,1,50      50,101,1,50    0,51,51,100    50,101,51,100

Again, we allocate arrays on the data domain, perform computations on the compute domain, and call mpp_update_domains to update the halo region.

If we wished to perform a 1D decomposition along Y on the same global domain, we could use:

               call mpp_define_domains( (/1,100,1,100/), layout=(/4,1/), domain, xhalo=1 )

This will create the following domain layout:

    domain            domain(1)      domain(2)      domain(3)      domain(4)
    Compute domain    1,100,1,25     1,100,26,50    1,100,51,75    1,100,76,100
    Data domain       0,101,1,25     0,101,26,50    0,101,51,75    1,101,76,100

Definition at line 891 of file mpp_domains.F90.

Public Member Functions

 mpp_define_domains1d
 
 mpp_define_domains2d
 

◆ mpp_domains_mod::mpp_define_layout

interface mpp_domains_mod::mpp_define_layout

Retrieve layout associated with a domain decomposition. Given a global 2D domain and the number of divisions in the decomposition ndivs (usually the PE count unless some domains are masked), this call returns a 2D domain layout. By default, mpp_define_layout will attempt to divide the 2D index space into domains that maintain the aspect ratio of the global domain. If this cannot be done, the algorithm favours domains that are longer in x than y, a preference that could improve vector performance.
Example usage:

call mpp_define_layout( global_indices, ndivs, layout )

Definition at line 773 of file mpp_domains.F90.

Public Member Functions

 mpp_define_layout2d
 

◆ mpp_domains_mod::mpp_define_null_domain

interface mpp_domains_mod::mpp_define_null_domain

Defines a nullified 1D or 2D domain.


Example usage:

call mpp_define_null_domain(domain)

Definition at line 903 of file mpp_domains.F90.

Public Member Functions

 mpp_define_null_domain1d
 
 mpp_define_null_domain2d
 

◆ mpp_domains_mod::mpp_do_check

interface mpp_domains_mod::mpp_do_check

Private interface to update the data domain of a 3D field whose computational domains have been computed.

Definition at line 1555 of file mpp_domains.F90.

Public Member Functions

 mpp_do_check_c4_3d
 
 mpp_do_check_c8_3d
 
 mpp_do_check_i4_3d
 
 mpp_do_check_i8_3d
 
 mpp_do_check_r4_3d
 
 mpp_do_check_r4_3dv
 
 mpp_do_check_r8_3d
 
 mpp_do_check_r8_3dv
 

◆ mpp_domains_mod::mpp_do_get_boundary

interface mpp_domains_mod::mpp_do_get_boundary

Definition at line 1656 of file mpp_domains.F90.

Public Member Functions

 mpp_do_get_boundary_r4_3d
 
 mpp_do_get_boundary_r4_3dv
 
 mpp_do_get_boundary_r8_3d
 
 mpp_do_get_boundary_r8_3dv
 

◆ mpp_domains_mod::mpp_do_get_boundary_ad

interface mpp_domains_mod::mpp_do_get_boundary_ad

Definition at line 1664 of file mpp_domains.F90.

Public Member Functions

 mpp_do_get_boundary_ad_r4_3d
 
 mpp_do_get_boundary_ad_r4_3dv
 
 mpp_do_get_boundary_ad_r8_3d
 
 mpp_do_get_boundary_ad_r8_3dv
 

◆ mpp_domains_mod::mpp_do_global_field

interface mpp_domains_mod::mpp_do_global_field

Private helper interface used by mpp_global_field.

Definition at line 1865 of file mpp_domains.F90.

Public Member Functions

 mpp_do_global_field2d_c4_3d
 
 mpp_do_global_field2d_c8_3d
 
 mpp_do_global_field2d_i4_3d
 
 mpp_do_global_field2d_i8_3d
 
 mpp_do_global_field2d_l4_3d
 
 mpp_do_global_field2d_l8_3d
 
 mpp_do_global_field2d_r4_3d
 
 mpp_do_global_field2d_r8_3d
 

◆ mpp_domains_mod::mpp_do_global_field_ad

interface mpp_domains_mod::mpp_do_global_field_ad

Definition at line 1917 of file mpp_domains.F90.

Public Member Functions

 mpp_do_global_field2d_c4_3d_ad
 
 mpp_do_global_field2d_c8_3d_ad
 
 mpp_do_global_field2d_i4_3d_ad
 
 mpp_do_global_field2d_i8_3d_ad
 
 mpp_do_global_field2d_l4_3d_ad
 
 mpp_do_global_field2d_l8_3d_ad
 
 mpp_do_global_field2d_r4_3d_ad
 
 mpp_do_global_field2d_r8_3d_ad
 

◆ mpp_domains_mod::mpp_do_group_update

interface mpp_domains_mod::mpp_do_group_update

Definition at line 1330 of file mpp_domains.F90.

Public Member Functions

 mpp_do_group_update_r4
 
 mpp_do_group_update_r8
 

◆ mpp_domains_mod::mpp_do_redistribute

interface mpp_domains_mod::mpp_do_redistribute

Definition at line 1718 of file mpp_domains.F90.

Public Member Functions

 mpp_do_redistribute_c4_3d
 
 mpp_do_redistribute_c8_3d
 
 mpp_do_redistribute_i4_3d
 
 mpp_do_redistribute_i8_3d
 
 mpp_do_redistribute_l4_3d
 
 mpp_do_redistribute_l8_3d
 
 mpp_do_redistribute_r4_3d
 
 mpp_do_redistribute_r8_3d
 

◆ mpp_domains_mod::mpp_do_update

interface mpp_domains_mod::mpp_do_update

Private interface used for mpp_update_domains.

Definition at line 1539 of file mpp_domains.F90.

Public Member Functions

 mpp_do_update_c4_3d
 
 mpp_do_update_c8_3d
 
 mpp_do_update_i4_3d
 
 mpp_do_update_i8_3d
 
 mpp_do_update_r4_3d
 
 mpp_do_update_r4_3dv
 
 mpp_do_update_r8_3d
 
 mpp_do_update_r8_3dv
 

◆ mpp_domains_mod::mpp_do_update_ad

interface mpp_domains_mod::mpp_do_update_ad

Private interface for adjoint domain updates (the adjoint counterpart of mpp_do_update).

Definition at line 1608 of file mpp_domains.F90.

Public Member Functions

 mpp_do_update_ad_r4_3d
 
 mpp_do_update_ad_r4_3dv
 
 mpp_do_update_ad_r8_3d
 
 mpp_do_update_ad_r8_3dv
 

◆ mpp_domains_mod::mpp_do_update_nest_coarse

interface mpp_domains_mod::mpp_do_update_nest_coarse

Used by mpp_update_nest_coarse to perform domain updates.

Definition at line 1471 of file mpp_domains.F90.

Public Member Functions

 mpp_do_update_nest_coarse_i4_3d
 
 mpp_do_update_nest_coarse_i8_3d
 
 mpp_do_update_nest_coarse_r4_3d
 
 mpp_do_update_nest_coarse_r4_3dv
 
 mpp_do_update_nest_coarse_r8_3d
 
 mpp_do_update_nest_coarse_r8_3dv
 

◆ mpp_domains_mod::mpp_do_update_nest_fine

interface mpp_domains_mod::mpp_do_update_nest_fine

Definition at line 1415 of file mpp_domains.F90.

Public Member Functions

 mpp_do_update_nest_fine_i4_3d
 
 mpp_do_update_nest_fine_i8_3d
 
 mpp_do_update_nest_fine_r4_3d
 
 mpp_do_update_nest_fine_r4_3dv
 
 mpp_do_update_nest_fine_r8_3d
 
 mpp_do_update_nest_fine_r8_3dv
 

◆ mpp_domains_mod::mpp_get_boundary

interface mpp_domains_mod::mpp_get_boundary

Get the boundary data for a symmetric domain when the data is at the C, E, or N-cell center.
mpp_get_boundary is used to get the boundary data for a symmetric domain when the data is at the C, E, or N-cell center. For the cubic grid, the data should always be at the C-cell center.
Example usage:

               call mpp_get_boundary(domain, field, ebuffer, sbuffer, wbuffer, nbuffer)

Get boundary information from domain and field and store in buffers

Definition at line 1624 of file mpp_domains.F90.

Public Member Functions

 mpp_get_boundary_r4_2d
 
 mpp_get_boundary_r4_2dv
 
 mpp_get_boundary_r4_3d
 
 mpp_get_boundary_r4_3dv
 
 mpp_get_boundary_r8_2d
 
 mpp_get_boundary_r8_2dv
 
 mpp_get_boundary_r8_3d
 
 mpp_get_boundary_r8_3dv
 

◆ mpp_domains_mod::mpp_get_boundary_ad

interface mpp_domains_mod::mpp_get_boundary_ad

Definition at line 1644 of file mpp_domains.F90.

Public Member Functions

 mpp_get_boundary_ad_r4_2d
 
 mpp_get_boundary_ad_r4_2dv
 
 mpp_get_boundary_ad_r4_3d
 
 mpp_get_boundary_ad_r4_3dv
 
 mpp_get_boundary_ad_r8_2d
 
 mpp_get_boundary_ad_r8_2dv
 
 mpp_get_boundary_ad_r8_3d
 
 mpp_get_boundary_ad_r8_3dv
 

◆ mpp_domains_mod::mpp_get_compute_domain

interface mpp_domains_mod::mpp_get_compute_domain

These routines retrieve the axis specifications associated with the compute domains. The domain is a derived type with private elements. The 2D version of these is a simple extension of the 1D version.
Example usage:

       call mpp_get_compute_domain(domain_1D, is, ie)
       call mpp_get_compute_domain(domain_2D, is, ie, js, je)

Definition at line 2192 of file mpp_domains.F90.

Public Member Functions

 mpp_get_compute_domain1d
 
 mpp_get_compute_domain2d
 

◆ mpp_domains_mod::mpp_get_compute_domains

interface mpp_domains_mod::mpp_get_compute_domains

Retrieve the entire array of compute domain extents associated with a decomposition.

Parameters
domain                 2D domain
[out] xbegin, ybegin   x and y domain starting indices
[out] xsize, ysize     x and y domain sizes
Example usage:
       call mpp_get_compute_domains( domain, xbegin, xend, xsize, &
                                           ybegin, yend, ysize )

Definition at line 2207 of file mpp_domains.F90.

Public Member Functions

 mpp_get_compute_domains1d
 
 mpp_get_compute_domains2d
 

◆ mpp_domains_mod::mpp_get_data_domain

interface mpp_domains_mod::mpp_get_data_domain

These routines retrieve the axis specifications associated with the data domains. The domain is a derived type with private elements. The 2D version of these is a simple extension of the 1D version.
Example usage:

               call mpp_get_data_domain(domain_1d, isd, ied)
               call mpp_get_data_domain(domain_2d, isd, ied, jsd, jed)

Definition at line 2227 of file mpp_domains.F90.

Public Member Functions

 mpp_get_data_domain1d
 
 mpp_get_data_domain2d
 

◆ mpp_domains_mod::mpp_get_domain_extents

interface mpp_domains_mod::mpp_get_domain_extents

Definition at line 2261 of file mpp_domains.F90.

Public Member Functions

 mpp_get_domain_extents1d
 
 mpp_get_domain_extents2d
 

◆ mpp_domains_mod::mpp_get_f2c_index

interface mpp_domains_mod::mpp_get_f2c_index

Get the index of the data passed from fine grid to coarse grid.
Example usage:

       call mpp_get_F2C_index(nest_domain, is_coarse, ie_coarse, js_coarse, je_coarse, &
                       is_fine, ie_fine, js_fine, je_fine, nest_level, position)

Definition at line 1492 of file mpp_domains.F90.

Public Member Functions

 mpp_get_f2c_index_coarse
 
 mpp_get_f2c_index_fine
 

◆ mpp_domains_mod::mpp_get_global_domain

interface mpp_domains_mod::mpp_get_global_domain

These routines retrieve the axis specifications associated with the global domains. The domain is a derived type with private elements. The 2D version of these is a simple extension of the 1D version.
Example usage:

               call mpp_get_global_domain(domain_1d, isg, ieg)
               call mpp_get_global_domain(domain_2d, isg, ieg, jsg, jeg)

Definition at line 2241 of file mpp_domains.F90.

Public Member Functions

 mpp_get_global_domain1d
 
 mpp_get_global_domain2d
 

◆ mpp_domains_mod::mpp_get_global_domains

interface mpp_domains_mod::mpp_get_global_domains

Definition at line 2213 of file mpp_domains.F90.

Public Member Functions

 mpp_get_global_domains1d
 
 mpp_get_global_domains2d
 

◆ mpp_domains_mod::mpp_get_layout

interface mpp_domains_mod::mpp_get_layout

Retrieve layout associated with a domain decomposition. The 1D version of this call returns the number of divisions that was assigned to this decomposition axis. The 2D version of this call returns an array of dimension 2 holding the results on the two axes.
Example usage:

               call mpp_get_layout( domain, layout )

Definition at line 2329 of file mpp_domains.F90.

Public Member Functions

 mpp_get_layout1d
 
 mpp_get_layout2d
 

◆ mpp_domains_mod::mpp_get_memory_domain

interface mpp_domains_mod::mpp_get_memory_domain

These routines retrieve the axis specifications associated with the memory domains. The domain is a derived type with private elements. The 2D version of these is a simple extension of the 1D version.
Example usage:

               call mpp_get_memory_domain(domain_1d, ism, iem)
               call mpp_get_memory_domain(domain_2d, ism, iem, jsm, jem)

Definition at line 2255 of file mpp_domains.F90.

Public Member Functions

 mpp_get_memory_domain1d
 
 mpp_get_memory_domain2d
 

◆ mpp_domains_mod::mpp_get_neighbor_pe

interface mpp_domains_mod::mpp_get_neighbor_pe

Retrieve PE number of a neighboring domain.

Given a 1-D or 2-D domain decomposition, this call allows users to retrieve the PE number of an adjacent PE-domain while taking into account that the domain may have holes (masked) and/or have cyclic boundary conditions and/or a folded edge. Which PE-domain will be retrieved will depend on "direction": +1 (right) or -1 (left) for a 1-D domain decomposition and either NORTH, SOUTH, EAST, WEST, NORTH_EAST, SOUTH_EAST, SOUTH_WEST, or NORTH_WEST for a 2-D decomposition. If no neighboring domain exists (masked domain), then the returned "pe" value will be set to NULL_PE.

Example usage:

               call mpp_get_neighbor_pe( domain1d, direction=+1   , pe)

Set pe to the neighbor pe number that is to the right of the current pe

               call mpp_get_neighbor_pe( domain2d, direction=NORTH, pe)

Get neighbor pe number that's above/north of the current pe

Definition at line 2147 of file mpp_domains.F90.

Public Member Functions

 mpp_get_neighbor_pe_1d
 
 mpp_get_neighbor_pe_2d
 

◆ mpp_domains_mod::mpp_get_pelist

interface mpp_domains_mod::mpp_get_pelist

Retrieve list of PEs associated with a domain decomposition. The 1D version of this call returns an array of the PEs assigned to this 1D domain decomposition. In addition the optional argument pos may be used to retrieve the 0-based position of the domain local to the calling PE, i.e., domain%list(pos)%pe is the local PE, as returned by mpp_pe(). The 2D version of this call is identical to 1D version.
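
Example usage (a minimal sketch; allocating pelist over 0:npes-1, where npes is the number of PEs in the decomposition, is an assumption chosen to match the 0-based pos described above):

       integer, allocatable :: pelist(:)
       integer :: pos
       allocate( pelist(0:npes-1) )
       call mpp_get_pelist( domain, pelist, pos )
       ! with this 0-based allocation, pelist(pos) is the calling PE, as returned by mpp_pe()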

Definition at line 2316 of file mpp_domains.F90.

Public Member Functions

 mpp_get_pelist1d
 
 mpp_get_pelist2d
 

◆ mpp_domains_mod::mpp_global_field

interface mpp_domains_mod::mpp_global_field

Fill in a global array from domain-decomposed arrays.

mpp_global_field is used to get an entire domain-decomposed array on each PE. MPP_TYPE_ can be of type complex, integer, logical or real; of 4-byte or 8-byte kind; of rank up to 5.

All PEs in a domain decomposition must call mpp_global_field, and each will have a complete global field at the end. Please note that a global array of rank 3 or higher could occupy a lot of memory.

Parameters
domain         2D domain
local          Data dimensioned on either the compute or data domains of 'domain'
[out] global   output data dimensioned on the corresponding global domain
flags          can be either XONLY or YONLY to specify a globalization on one axis only


Example usage:

call mpp_global_field( domain, local, global, flags )
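
A slightly fuller sketch (the array names and the use of the data/global domain bounds to size them are illustrative):

       call mpp_get_data_domain  ( domain, isd, ied, jsd, jed )
       call mpp_get_global_domain( domain, isg, ieg, jsg, jeg )
       allocate( local (isd:ied, jsd:jed, nz) )
       allocate( global(isg:ieg, jsg:jeg, nz) )
       ! ... fill local on the compute domain ...
       call mpp_global_field( domain, local, global )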

Definition at line 1784 of file mpp_domains.F90.

Public Member Functions

 mpp_global_field2d_c4_2d
 
 mpp_global_field2d_c4_3d
 
 mpp_global_field2d_c4_4d
 
 mpp_global_field2d_c4_5d
 
 mpp_global_field2d_c8_2d
 
 mpp_global_field2d_c8_3d
 
 mpp_global_field2d_c8_4d
 
 mpp_global_field2d_c8_5d
 
 mpp_global_field2d_i4_2d
 
 mpp_global_field2d_i4_3d
 
 mpp_global_field2d_i4_4d
 
 mpp_global_field2d_i4_5d
 
 mpp_global_field2d_i8_2d
 
 mpp_global_field2d_i8_3d
 
 mpp_global_field2d_i8_4d
 
 mpp_global_field2d_i8_5d
 
 mpp_global_field2d_l4_2d
 
 mpp_global_field2d_l4_3d
 
 mpp_global_field2d_l4_4d
 
 mpp_global_field2d_l4_5d
 
 mpp_global_field2d_l8_2d
 
 mpp_global_field2d_l8_3d
 
 mpp_global_field2d_l8_4d
 
 mpp_global_field2d_l8_5d
 
 mpp_global_field2d_r4_2d
 
 mpp_global_field2d_r4_3d
 
 mpp_global_field2d_r4_4d
 
 mpp_global_field2d_r4_5d
 
 mpp_global_field2d_r8_2d
 
 mpp_global_field2d_r8_3d
 
 mpp_global_field2d_r8_4d
 
 mpp_global_field2d_r8_5d
 

◆ mpp_domains_mod::mpp_global_field_ad

interface mpp_domains_mod::mpp_global_field_ad

Definition at line 1824 of file mpp_domains.F90.

Public Member Functions

 mpp_global_field2d_c4_2d_ad
 
 mpp_global_field2d_c4_3d_ad
 
 mpp_global_field2d_c4_4d_ad
 
 mpp_global_field2d_c4_5d_ad
 
 mpp_global_field2d_c8_2d_ad
 
 mpp_global_field2d_c8_3d_ad
 
 mpp_global_field2d_c8_4d_ad
 
 mpp_global_field2d_c8_5d_ad
 
 mpp_global_field2d_i4_2d_ad
 
 mpp_global_field2d_i4_3d_ad
 
 mpp_global_field2d_i4_4d_ad
 
 mpp_global_field2d_i4_5d_ad
 
 mpp_global_field2d_i8_2d_ad
 
 mpp_global_field2d_i8_3d_ad
 
 mpp_global_field2d_i8_4d_ad
 
 mpp_global_field2d_i8_5d_ad
 
 mpp_global_field2d_l4_2d_ad
 
 mpp_global_field2d_l4_3d_ad
 
 mpp_global_field2d_l4_4d_ad
 
 mpp_global_field2d_l4_5d_ad
 
 mpp_global_field2d_l8_2d_ad
 
 mpp_global_field2d_l8_3d_ad
 
 mpp_global_field2d_l8_4d_ad
 
 mpp_global_field2d_l8_5d_ad
 
 mpp_global_field2d_r4_2d_ad
 
 mpp_global_field2d_r4_3d_ad
 
 mpp_global_field2d_r4_4d_ad
 
 mpp_global_field2d_r4_5d_ad
 
 mpp_global_field2d_r8_2d_ad
 
 mpp_global_field2d_r8_3d_ad
 
 mpp_global_field2d_r8_4d_ad
 
 mpp_global_field2d_r8_5d_ad
 

◆ mpp_domains_mod::mpp_global_field_ug

interface mpp_domains_mod::mpp_global_field_ug

Same functionality as mpp_global_field but for unstructured domains.
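
Example usage (a sketch assuming the argument order mirrors mpp_global_field; the names are illustrative):

       call mpp_global_field_ug( ug_domain, local_ug, global_ug )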

Definition at line 1897 of file mpp_domains.F90.

Public Member Functions

 mpp_global_field2d_ug_i4_2d
 
 mpp_global_field2d_ug_i4_3d
 
 mpp_global_field2d_ug_i4_4d
 
 mpp_global_field2d_ug_i4_5d
 
 mpp_global_field2d_ug_i8_2d
 
 mpp_global_field2d_ug_i8_3d
 
 mpp_global_field2d_ug_i8_4d
 
 mpp_global_field2d_ug_i8_5d
 
 mpp_global_field2d_ug_r4_2d
 
 mpp_global_field2d_ug_r4_3d
 
 mpp_global_field2d_ug_r4_4d
 
 mpp_global_field2d_ug_r4_5d
 
 mpp_global_field2d_ug_r8_2d
 
 mpp_global_field2d_ug_r8_3d
 
 mpp_global_field2d_ug_r8_4d
 
 mpp_global_field2d_ug_r8_5d
 

◆ mpp_domains_mod::mpp_global_max

interface mpp_domains_mod::mpp_global_max

Global max of domain-decomposed arrays.
mpp_global_max is used to get the maximum value of a domain-decomposed array on each PE. MPP_TYPE_ can be of type integer or real; of 4-byte or 8-byte kind; of rank up to 5. The dimension of locus must equal the rank of field.

All PEs in a domain decomposition must call mpp_global_max, and each will have the result upon exit. The function mpp_global_min, with an identical syntax, is also available.

Parameters
domain   2D domain
field    field data dimensioned on either the compute or data domains of 'domain'
locus    If present, can be used to retrieve the location of the maximum


Example usage: mpp_global_max( domain, field, locus )
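
A short sketch of the locus argument (names are illustrative; locus is dimensioned to the rank of field, as required above):

       real    :: field(isd:ied, jsd:jed, nz)
       real    :: fmax
       integer :: locus(3)          ! field has rank 3, so locus has 3 elements
       fmax = mpp_global_max( domain, field, locus )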

Definition at line 1949 of file mpp_domains.F90.

Public Member Functions

 mpp_global_max_i4_2d
 
 mpp_global_max_i4_3d
 
 mpp_global_max_i4_4d
 
 mpp_global_max_i4_5d
 
 mpp_global_max_i8_2d
 
 mpp_global_max_i8_3d
 
 mpp_global_max_i8_4d
 
 mpp_global_max_i8_5d
 
 mpp_global_max_r4_2d
 
 mpp_global_max_r4_3d
 
 mpp_global_max_r4_4d
 
 mpp_global_max_r4_5d
 
 mpp_global_max_r8_2d
 
 mpp_global_max_r8_3d
 
 mpp_global_max_r8_4d
 
 mpp_global_max_r8_5d
 

◆ mpp_domains_mod::mpp_global_min

interface mpp_domains_mod::mpp_global_min

Global min of domain-decomposed arrays.
mpp_global_min is used to get the minimum value of a domain-decomposed array on each PE. MPP_TYPE_ can be of type integer or real; of 4-byte or 8-byte kind; of rank up to 5. The dimension of locus must equal the rank of field.

All PEs in a domain decomposition must call mpp_global_min, and each will have the result upon exit. The function mpp_global_max, with an identical syntax, is also available.

Parameters
domain   2D domain
field    field data dimensioned on either the compute or data domains of 'domain'
locus    If present, can be used to retrieve the location of the minimum


Example usage: mpp_global_min( domain, field, locus )

Definition at line 1985 of file mpp_domains.F90.

Public Member Functions

 mpp_global_min_i4_2d
 
 mpp_global_min_i4_3d
 
 mpp_global_min_i4_4d
 
 mpp_global_min_i4_5d
 
 mpp_global_min_i8_2d
 
 mpp_global_min_i8_3d
 
 mpp_global_min_i8_4d
 
 mpp_global_min_i8_5d
 
 mpp_global_min_r4_2d
 
 mpp_global_min_r4_3d
 
 mpp_global_min_r4_4d
 
 mpp_global_min_r4_5d
 
 mpp_global_min_r8_2d
 
 mpp_global_min_r8_3d
 
 mpp_global_min_r8_4d
 
 mpp_global_min_r8_5d
 

◆ mpp_domains_mod::mpp_global_sum

interface mpp_domains_mod::mpp_global_sum

Global sum of domain-decomposed arrays.
mpp_global_sum is used to get the sum of a domain-decomposed array on each PE. MPP_TYPE_ can be of type integer, complex, or real; of 4-byte or 8-byte kind; of rank up to 5.

Parameters
domain   2D domain
field    field data dimensioned on either the compute or data domain of 'domain'
flags    If present, must have the value BITWISE_EXACT_SUM. This produces a sum that is guaranteed to be identical irrespective of how the domain is decomposed. This method does the sum first along the ranks beyond 2, and then calls mpp_global_field to produce a global 2D array which is then summed. The default method, which is considerably faster, does a local sum followed by mpp_sum across the domain decomposition.


Example usage: call mpp_global_sum( domain, field, flags )

Note
All PEs in a domain decomposition must call mpp_global_sum, and each will have the result upon exit.
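
A sketch contrasting the default and bitwise-exact forms (variable names are illustrative; the function-style invocation follows the mpp_global_max/mpp_global_min convention above):

       real :: field(isd:ied, jsd:jed, nz)
       real :: s_fast, s_exact
       ! default: local sum followed by mpp_sum (fast, but rounding depends on the decomposition)
       s_fast  = mpp_global_sum( domain, field )
       ! reproducible sum, identical for any domain decomposition
       s_exact = mpp_global_sum( domain, field, flags=BITWISE_EXACT_SUM )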

Definition at line 2023 of file mpp_domains.F90.

Public Member Functions

 mpp_global_sum_c4_2d
 
 mpp_global_sum_c4_3d
 
 mpp_global_sum_c4_4d
 
 mpp_global_sum_c4_5d
 
 mpp_global_sum_c8_2d
 
 mpp_global_sum_c8_3d
 
 mpp_global_sum_c8_4d
 
 mpp_global_sum_c8_5d
 
 mpp_global_sum_i4_2d
 
 mpp_global_sum_i4_3d
 
 mpp_global_sum_i4_4d
 
 mpp_global_sum_i4_5d
 
 mpp_global_sum_i8_2d
 
 mpp_global_sum_i8_3d
 
 mpp_global_sum_i8_4d
 
 mpp_global_sum_i8_5d
 
 mpp_global_sum_r4_2d
 
 mpp_global_sum_r4_3d
 
 mpp_global_sum_r4_4d
 
 mpp_global_sum_r4_5d
 
 mpp_global_sum_r8_2d
 
 mpp_global_sum_r8_3d
 
 mpp_global_sum_r8_4d
 
 mpp_global_sum_r8_5d
 

◆ mpp_domains_mod::mpp_global_sum_ad

interface mpp_domains_mod::mpp_global_sum_ad

Definition at line 2090 of file mpp_domains.F90.

Public Member Functions

 mpp_global_sum_ad_c4_2d
 
 mpp_global_sum_ad_c4_3d
 
 mpp_global_sum_ad_c4_4d
 
 mpp_global_sum_ad_c4_5d
 
 mpp_global_sum_ad_c8_2d
 
 mpp_global_sum_ad_c8_3d
 
 mpp_global_sum_ad_c8_4d
 
 mpp_global_sum_ad_c8_5d
 
 mpp_global_sum_ad_i4_2d
 
 mpp_global_sum_ad_i4_3d
 
 mpp_global_sum_ad_i4_4d
 
 mpp_global_sum_ad_i4_5d
 
 mpp_global_sum_ad_i8_2d
 
 mpp_global_sum_ad_i8_3d
 
 mpp_global_sum_ad_i8_4d
 
 mpp_global_sum_ad_i8_5d
 
 mpp_global_sum_ad_r4_2d
 
 mpp_global_sum_ad_r4_3d
 
 mpp_global_sum_ad_r4_4d
 
 mpp_global_sum_ad_r4_5d
 
 mpp_global_sum_ad_r8_2d
 
 mpp_global_sum_ad_r8_3d
 
 mpp_global_sum_ad_r8_4d
 
 mpp_global_sum_ad_r8_5d
 

◆ mpp_domains_mod::mpp_global_sum_tl

interface mpp_domains_mod::mpp_global_sum_tl

Definition at line 2056 of file mpp_domains.F90.

Public Member Functions

 mpp_global_sum_tl_c4_2d
 
 mpp_global_sum_tl_c4_3d
 
 mpp_global_sum_tl_c4_4d
 
 mpp_global_sum_tl_c4_5d
 
 mpp_global_sum_tl_c8_2d
 
 mpp_global_sum_tl_c8_3d
 
 mpp_global_sum_tl_c8_4d
 
 mpp_global_sum_tl_c8_5d
 
 mpp_global_sum_tl_i4_2d
 
 mpp_global_sum_tl_i4_3d
 
 mpp_global_sum_tl_i4_4d
 
 mpp_global_sum_tl_i4_5d
 
 mpp_global_sum_tl_i8_2d
 
 mpp_global_sum_tl_i8_3d
 
 mpp_global_sum_tl_i8_4d
 
 mpp_global_sum_tl_i8_5d
 
 mpp_global_sum_tl_r4_2d
 
 mpp_global_sum_tl_r4_3d
 
 mpp_global_sum_tl_r4_4d
 
 mpp_global_sum_tl_r4_5d
 
 mpp_global_sum_tl_r8_2d
 
 mpp_global_sum_tl_r8_3d
 
 mpp_global_sum_tl_r8_4d
 
 mpp_global_sum_tl_r8_5d
 

◆ mpp_domains_mod::mpp_group_update_type

type mpp_domains_mod::mpp_group_update_type

used for updates on a group

Definition at line 575 of file mpp_domains.F90.

Collaboration diagram for mpp_group_update_type:

Public Attributes

integer(i8_kind), dimension(max_domain_fields) addrs_s
 
integer(i8_kind), dimension(max_domain_fields) addrs_x
 
integer(i8_kind), dimension(max_domain_fields) addrs_y
 
integer, dimension(maxoverlap) buffer_pos_recv
 
integer, dimension(maxoverlap) buffer_pos_send
 
integer buffer_start_pos = -1
 
integer ehalo_s =0
 
integer ehalo_v =0
 
integer flags_s =0
 
integer flags_v =0
 
integer, dimension(maxoverlap) from_pe
 
integer gridtype =0
 
integer ie_s =0
 
integer ie_x =0
 
integer ie_y =0
 
integer is_s =0
 
integer is_x =0
 
integer is_y =0
 
integer isize_s =0
 
integer isize_x =0
 
integer isize_y =0
 
integer je_s =0
 
integer je_x =0
 
integer je_y =0
 
integer js_s =0
 
integer js_x =0
 
integer js_y =0
 
integer jsize_s =0
 
integer jsize_x =0
 
integer jsize_y =0
 
logical k_loop_inside = .TRUE.
 
integer ksize_s =1
 
integer ksize_v =1
 
integer nhalo_s =0
 
integer nhalo_v =0
 
logical nonsym_edge = .FALSE.
 
integer npack =0
 
integer nrecv =0
 
integer nscalar = 0
 
integer nsend =0
 
integer nunpack =0
 
integer nvector = 0
 
integer, dimension(maxoverlap) pack_buffer_pos
 
integer, dimension(maxoverlap) pack_ie
 
integer, dimension(maxoverlap) pack_is
 
integer, dimension(maxoverlap) pack_je
 
integer, dimension(maxoverlap) pack_js
 
integer, dimension(maxoverlap) pack_rotation
 
integer, dimension(maxoverlap) pack_size
 
integer, dimension(maxoverlap) pack_type
 
integer position =0
 
logical, dimension(8) recv_s
 
integer, dimension(maxoverlap) recv_size
 
logical, dimension(8) recv_x
 
logical, dimension(8) recv_y
 
integer, dimension(max_request) request_recv
 
integer, dimension(max_request) request_send
 
integer reset_index_s = 0
 
integer reset_index_v = 0
 
integer, dimension(maxoverlap) send_size
 
integer shalo_s =0
 
integer shalo_v =0
 
integer, dimension(maxoverlap) to_pe
 
integer tot_msgsize = 0
 
integer, dimension(max_request) type_recv
 
integer, dimension(maxoverlap) unpack_buffer_pos
 
integer, dimension(maxoverlap) unpack_ie
 
integer, dimension(maxoverlap) unpack_is
 
integer, dimension(maxoverlap) unpack_je
 
integer, dimension(maxoverlap) unpack_js
 
integer, dimension(maxoverlap) unpack_rotation
 
integer, dimension(maxoverlap) unpack_size
 
integer, dimension(maxoverlap) unpack_type
 
integer whalo_s =0
 
integer whalo_v =0
 

Private Attributes

logical initialized = .FALSE.
 

Member Data Documentation

◆ addrs_s

integer(i8_kind), dimension(max_domain_fields) addrs_s

Definition at line 620 of file mpp_domains.F90.

◆ addrs_x

integer(i8_kind), dimension(max_domain_fields) addrs_x

Definition at line 621 of file mpp_domains.F90.

◆ addrs_y

integer(i8_kind), dimension(max_domain_fields) addrs_y

Definition at line 622 of file mpp_domains.F90.

◆ buffer_pos_recv

integer, dimension(maxoverlap) buffer_pos_recv

Definition at line 602 of file mpp_domains.F90.

◆ buffer_pos_send

integer, dimension(maxoverlap) buffer_pos_send

Definition at line 603 of file mpp_domains.F90.

◆ buffer_start_pos

integer buffer_start_pos = -1

Definition at line 623 of file mpp_domains.F90.

◆ ehalo_s

integer ehalo_s =0

Definition at line 583 of file mpp_domains.F90.

◆ ehalo_v

integer ehalo_v =0

Definition at line 585 of file mpp_domains.F90.

◆ flags_s

integer flags_s =0

Definition at line 582 of file mpp_domains.F90.

◆ flags_v

integer flags_v =0

Definition at line 582 of file mpp_domains.F90.

◆ from_pe

integer, dimension(maxoverlap) from_pe

Definition at line 598 of file mpp_domains.F90.

◆ gridtype

integer gridtype =0

Definition at line 588 of file mpp_domains.F90.

◆ ie_s

integer ie_s =0

Definition at line 590 of file mpp_domains.F90.

◆ ie_x

integer ie_x =0

Definition at line 591 of file mpp_domains.F90.

◆ ie_y

integer ie_y =0

Definition at line 592 of file mpp_domains.F90.

◆ initialized

logical initialized = .FALSE.
private

Definition at line 577 of file mpp_domains.F90.

◆ is_s

integer is_s =0

Definition at line 590 of file mpp_domains.F90.

◆ is_x

integer is_x =0

Definition at line 591 of file mpp_domains.F90.

◆ is_y

integer is_y =0

Definition at line 592 of file mpp_domains.F90.

◆ isize_s

integer isize_s =0

Definition at line 584 of file mpp_domains.F90.

◆ isize_x

integer isize_x =0

Definition at line 586 of file mpp_domains.F90.

◆ isize_y

integer isize_y =0

Definition at line 587 of file mpp_domains.F90.

◆ je_s

integer je_s =0

Definition at line 590 of file mpp_domains.F90.

◆ je_x

integer je_x =0

Definition at line 591 of file mpp_domains.F90.

◆ je_y

integer je_y =0

Definition at line 592 of file mpp_domains.F90.

◆ js_s

integer js_s =0

Definition at line 590 of file mpp_domains.F90.

◆ js_x

integer js_x =0

Definition at line 591 of file mpp_domains.F90.

◆ js_y

integer js_y =0

Definition at line 592 of file mpp_domains.F90.

◆ jsize_s

integer jsize_s =0

Definition at line 584 of file mpp_domains.F90.

◆ jsize_x

integer jsize_x =0

Definition at line 586 of file mpp_domains.F90.

◆ jsize_y

integer jsize_y =0

Definition at line 587 of file mpp_domains.F90.

◆ k_loop_inside

logical k_loop_inside = .TRUE.

Definition at line 578 of file mpp_domains.F90.

◆ ksize_s

integer ksize_s =1

Definition at line 584 of file mpp_domains.F90.

◆ ksize_v

integer ksize_v =1

Definition at line 586 of file mpp_domains.F90.

◆ nhalo_s

integer nhalo_s =0

Definition at line 583 of file mpp_domains.F90.

◆ nhalo_v

integer nhalo_v =0

Definition at line 585 of file mpp_domains.F90.

◆ nonsym_edge

logical nonsym_edge = .FALSE.

Definition at line 579 of file mpp_domains.F90.

◆ npack

integer npack =0

Definition at line 594 of file mpp_domains.F90.

◆ nrecv

integer nrecv =0

Definition at line 593 of file mpp_domains.F90.

◆ nscalar

integer nscalar = 0

Definition at line 580 of file mpp_domains.F90.

◆ nsend

integer nsend =0

Definition at line 593 of file mpp_domains.F90.

◆ nunpack

integer nunpack =0

Definition at line 594 of file mpp_domains.F90.

◆ nvector

integer nvector = 0

Definition at line 581 of file mpp_domains.F90.

◆ pack_buffer_pos

integer, dimension(maxoverlap) pack_buffer_pos

Definition at line 605 of file mpp_domains.F90.

◆ pack_ie

integer, dimension(maxoverlap) pack_ie

Definition at line 609 of file mpp_domains.F90.

◆ pack_is

integer, dimension(maxoverlap) pack_is

Definition at line 608 of file mpp_domains.F90.

◆ pack_je

integer, dimension(maxoverlap) pack_je

Definition at line 611 of file mpp_domains.F90.

◆ pack_js

integer, dimension(maxoverlap) pack_js

Definition at line 610 of file mpp_domains.F90.

◆ pack_rotation

integer, dimension(maxoverlap) pack_rotation

Definition at line 606 of file mpp_domains.F90.

◆ pack_size

integer, dimension(maxoverlap) pack_size

Definition at line 607 of file mpp_domains.F90.

◆ pack_type

integer, dimension(maxoverlap) pack_type

Definition at line 604 of file mpp_domains.F90.

◆ position

integer position =0

Definition at line 588 of file mpp_domains.F90.

◆ recv_s

logical, dimension(8) recv_s

Definition at line 589 of file mpp_domains.F90.

◆ recv_size

integer, dimension(maxoverlap) recv_size

Definition at line 600 of file mpp_domains.F90.

◆ recv_x

logical, dimension(8) recv_x

Definition at line 589 of file mpp_domains.F90.

◆ recv_y

logical, dimension(8) recv_y

Definition at line 589 of file mpp_domains.F90.

◆ request_recv

integer, dimension(max_request) request_recv

Definition at line 625 of file mpp_domains.F90.

◆ request_send

integer, dimension(max_request) request_send

Definition at line 624 of file mpp_domains.F90.

◆ reset_index_s

integer reset_index_s = 0

Definition at line 595 of file mpp_domains.F90.

◆ reset_index_v

integer reset_index_v = 0

Definition at line 596 of file mpp_domains.F90.

◆ send_size

integer, dimension(maxoverlap) send_size

Definition at line 601 of file mpp_domains.F90.

◆ shalo_s

integer shalo_s =0

Definition at line 583 of file mpp_domains.F90.

◆ shalo_v

integer shalo_v =0

Definition at line 585 of file mpp_domains.F90.

◆ to_pe

integer, dimension(maxoverlap) to_pe

Definition at line 599 of file mpp_domains.F90.

◆ tot_msgsize

integer tot_msgsize = 0

Definition at line 597 of file mpp_domains.F90.

◆ type_recv

integer, dimension(max_request) type_recv

Definition at line 626 of file mpp_domains.F90.

◆ unpack_buffer_pos

integer, dimension(maxoverlap) unpack_buffer_pos

Definition at line 613 of file mpp_domains.F90.

◆ unpack_ie

integer, dimension(maxoverlap) unpack_ie

Definition at line 617 of file mpp_domains.F90.

◆ unpack_is

integer, dimension(maxoverlap) unpack_is

Definition at line 616 of file mpp_domains.F90.

◆ unpack_je

integer, dimension(maxoverlap) unpack_je

Definition at line 619 of file mpp_domains.F90.

◆ unpack_js

integer, dimension(maxoverlap) unpack_js

Definition at line 618 of file mpp_domains.F90.

◆ unpack_rotation

integer, dimension(maxoverlap) unpack_rotation

Definition at line 614 of file mpp_domains.F90.

◆ unpack_size

integer, dimension(maxoverlap) unpack_size

Definition at line 615 of file mpp_domains.F90.

◆ unpack_type

integer, dimension(maxoverlap) unpack_type

Definition at line 612 of file mpp_domains.F90.

◆ whalo_s

integer whalo_s =0

Definition at line 583 of file mpp_domains.F90.

◆ whalo_v

integer whalo_v =0

Definition at line 585 of file mpp_domains.F90.

◆ mpp_domains_mod::mpp_modify_domain

interface mpp_domains_mod::mpp_modify_domain

Modifies the extents (compute, data and global) of a given domain.
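
Example usage (a sketch; the keyword halo arguments are an assumption about the 2-D form and should be checked against mpp_modify_domain2d):

       ! create a copy of domain_in whose halos are widened to 2 points on every side
       call mpp_modify_domain( domain_in, domain_out, whalo=2, ehalo=2, shalo=2, nhalo=2 )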

Definition at line 926 of file mpp_domains.F90.

Public Member Functions

 mpp_modify_domain1d
 
 mpp_modify_domain2d
 

◆ mpp_domains_mod::mpp_nullify_domain_list

interface mpp_domains_mod::mpp_nullify_domain_list

Nullify a domain list. This interface is needed in mpp_domains_test. A 1-D case can be added if needed.
Example usage:

               call mpp_nullify_domain_list(domain)

Definition at line 2346 of file mpp_domains.F90.

Public Member Functions

 nullify_domain2d_list
 

◆ mpp_domains_mod::mpp_pass_sg_to_ug

interface mpp_domains_mod::mpp_pass_sg_to_ug

Passes data from a structured grid to an unstructured grid
Example usage:

       call mpp_pass_SG_to_UG(domain, sg_data, ug_data)

Definition at line 1575 of file mpp_domains.F90.

Public Member Functions

 mpp_pass_sg_to_ug_i4_2d
 
 mpp_pass_sg_to_ug_i4_3d
 
 mpp_pass_sg_to_ug_l4_2d
 
 mpp_pass_sg_to_ug_l4_3d
 
 mpp_pass_sg_to_ug_r4_2d
 
 mpp_pass_sg_to_ug_r4_3d
 
 mpp_pass_sg_to_ug_r8_2d
 
 mpp_pass_sg_to_ug_r8_3d
 

◆ mpp_domains_mod::mpp_pass_ug_to_sg

interface mpp_domains_mod::mpp_pass_ug_to_sg

Passes a data field from an unstructured grid to a structured grid
Example usage:

       call mpp_pass_UG_to_SG(UG_domain, field_UG, field_SG)

Definition at line 1591 of file mpp_domains.F90.

Public Member Functions

 mpp_pass_ug_to_sg_i4_2d
 
 mpp_pass_ug_to_sg_i4_3d
 
 mpp_pass_ug_to_sg_l4_2d
 
 mpp_pass_ug_to_sg_l4_3d
 
 mpp_pass_ug_to_sg_r4_2d
 
 mpp_pass_ug_to_sg_r4_3d
 
 mpp_pass_ug_to_sg_r8_2d
 
 mpp_pass_ug_to_sg_r8_3d
 

◆ mpp_domains_mod::mpp_redistribute

interface mpp_domains_mod::mpp_redistribute

Reorganization of distributed global arrays.
mpp_redistribute is used to reorganize a distributed array. MPP_TYPE_can be of type integer, complex, or real; of 4-byte or 8-byte kind; of rank up to 5.
Example usage: call mpp_redistribute( domain_in, field_in, domain_out, field_out )
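
A fuller sketch (domain, layout and field names are illustrative; both decompositions cover the same global index space):

       ! two different decompositions of the same global grid
       call mpp_define_domains( (/1,ni,1,nj/), layout_in,  domain_in  )
       call mpp_define_domains( (/1,ni,1,nj/), layout_out, domain_out )
       ! field_in is allocated on domain_in, field_out on domain_out
       call mpp_redistribute( domain_in, field_in, domain_out, field_out )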

Definition at line 1678 of file mpp_domains.F90.

Public Member Functions

 mpp_redistribute_c4_2d
 
 mpp_redistribute_c4_3d
 
 mpp_redistribute_c4_4d
 
 mpp_redistribute_c4_5d
 
 mpp_redistribute_c8_2d
 
 mpp_redistribute_c8_3d
 
 mpp_redistribute_c8_4d
 
 mpp_redistribute_c8_5d
 
 mpp_redistribute_i4_2d
 
 mpp_redistribute_i4_3d
 
 mpp_redistribute_i4_4d
 
 mpp_redistribute_i4_5d
 
 mpp_redistribute_i8_2d
 
 mpp_redistribute_i8_3d
 
 mpp_redistribute_i8_4d
 
 mpp_redistribute_i8_5d
 
 mpp_redistribute_r4_2d
 
 mpp_redistribute_r4_3d
 
 mpp_redistribute_r4_4d
 
 mpp_redistribute_r4_5d
 
 mpp_redistribute_r8_2d
 
 mpp_redistribute_r8_3d
 
 mpp_redistribute_r8_4d
 
 mpp_redistribute_r8_5d
 

◆ mpp_domains_mod::mpp_reset_group_update_field

interface mpp_domains_mod::mpp_reset_group_update_field

Definition at line 1360 of file mpp_domains.F90.

Public Member Functions

 mpp_reset_group_update_field_r4_2d
 
 mpp_reset_group_update_field_r4_2dv
 
 mpp_reset_group_update_field_r4_3d
 
 mpp_reset_group_update_field_r4_3dv
 
 mpp_reset_group_update_field_r4_4d
 
 mpp_reset_group_update_field_r4_4dv
 
 mpp_reset_group_update_field_r8_2d
 
 mpp_reset_group_update_field_r8_2dv
 
 mpp_reset_group_update_field_r8_3d
 
 mpp_reset_group_update_field_r8_3dv
 
 mpp_reset_group_update_field_r8_4d
 
 mpp_reset_group_update_field_r8_4dv
 

◆ mpp_domains_mod::mpp_set_compute_domain

interface mpp_domains_mod::mpp_set_compute_domain

These routines set the axis specifications associated with the compute domains. The domain is a derived type with private elements. The 2D version of these is a simple extension of the 1D version.
Example usage:

               call mpp_set_compute_domain(domain_1d, is, ie)
               call mpp_set_compute_domain(domain_2d, is, ie, js, je)

Definition at line 2275 of file mpp_domains.F90.

Public Member Functions

 mpp_set_compute_domain1d
 
 mpp_set_compute_domain2d
 

◆ mpp_domains_mod::mpp_set_data_domain

interface mpp_domains_mod::mpp_set_data_domain

These routines set the axis specifications associated with the data domains. The domain is a derived type with private elements. The 2D version is a simple extension of the 1D version.
Example usage:

               call mpp_set_data_domain(domain_1d, isd, ied)
               call mpp_set_data_domain(domain_2d, isd, ied, jsd, jed)

Definition at line 2289 of file mpp_domains.F90.

Public Member Functions

 mpp_set_data_domain1d
 
 mpp_set_data_domain2d
 

◆ mpp_domains_mod::mpp_set_global_domain

interface mpp_domains_mod::mpp_set_global_domain

These routines set the axis specifications associated with the global domains. The domain is a derived type with private elements. The 2D version is a simple extension of the 1D version.
Example usage:

               call mpp_set_global_domain(domain_1d, isg, ieg)
               call mpp_set_global_domain(domain_2d, isg, ieg, jsg, jeg)

Definition at line 2303 of file mpp_domains.F90.

Public Member Functions

 mpp_set_global_domain1d
 
 mpp_set_global_domain2d
 

◆ mpp_domains_mod::mpp_start_do_update

interface mpp_domains_mod::mpp_start_do_update

Private interface used for non blocking updates.

Definition at line 1277 of file mpp_domains.F90.

Public Member Functions

 mpp_start_do_update_i4_3d
 
 mpp_start_do_update_i8_3d
 
 mpp_start_do_update_r4_3d
 
 mpp_start_do_update_r4_3dv
 
 mpp_start_do_update_r8_3d
 
 mpp_start_do_update_r8_3dv
 

◆ mpp_domains_mod::mpp_start_group_update

interface mpp_domains_mod::mpp_start_group_update

Starts a non-blocking group update. Must be followed by a call to mpp_complete_group_update. An mpp_group_update_type can be created with mpp_create_group_update.

Parameters
[in,out]  group   type(mpp_group_update_type): group type created for group update
[in,out]  domain  type(domain2D): domain to update

Definition at line 1342 of file mpp_domains.F90.
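Schematically, the grouped non-blocking pattern looks like the sketch below. The field and domain names are placeholders, and depending on the FMS version the calls may take additional kind-resolving or optional arguments, so treat this as an outline rather than exact signatures:

       type(mpp_group_update_type) :: group

       ! register each field belonging to the group (see mpp_create_group_update)
       call mpp_create_group_update( group, u, domain )
       call mpp_create_group_update( group, v, domain )

       ! post the sends/receives for the whole group in one shot
       call mpp_start_group_update( group, domain )

       ! ... computation that does not touch the halos ...

       ! wait for and unpack the grouped halo exchange
       call mpp_complete_group_update( group, domain )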

Public Member Functions

 mpp_start_group_update_r4
 
 mpp_start_group_update_r8
 

◆ mpp_domains_mod::mpp_start_update_domains

interface mpp_domains_mod::mpp_start_update_domains

Interface to start halo updates. mpp_start_update_domains is used to start a halo update of a domain-decomposed array on each PE. MPP_TYPE_ can be of type complex, integer, logical or real; of 4-byte or 8-byte kind; of rank up to 5. The vector version (with two input data fields) is only present for real types.

mpp_start_update_domains must be paired with mpp_complete_update_domains. In mpp_start_update_domains, receive buffers are pre-posted (non-blocking receives) and the data on the computational domain is packed and sent (non-blocking sends) to the other processors. In mpp_complete_update_domains, the buffers are unpacked to fill the halo, and mpp_sync_self is called at the last call of mpp_complete_update_domains to ensure the communication is safe.

Each mpp_update_domains call can be replaced by the combination of mpp_start_update_domains and mpp_complete_update_domains. The arguments in mpp_start_update_domains and mpp_complete_update_domains must be exactly the same as in the mpp_update_domains call being replaced, except that there is no optional argument "complete". The following examples show how to replace mpp_update_domains with mpp_start_update_domains/mpp_complete_update_domains.

Example 1: Replace one scalar mpp_update_domains.

Replace

call mpp_update_domains(data, domain, flags=update_flags)

with

id_update = mpp_start_update_domains(data, domain, flags=update_flags)
...( doing some computation )
call mpp_complete_update_domains(id_update, data, domain, flags=update_flags)

Example 2: Replace group scalar mpp_update_domains

Replace

call mpp_update_domains(data_1, domain, flags=update_flags, complete=.false.)
.... ( other n-2 call mpp_update_domains with complete = .false. )
call mpp_update_domains(data_n, domain, flags=update_flags, complete=.true. )

With

id_up_1 = mpp_start_update_domains(data_1, domain, flags=update_flags)
.... ( other n-2 call mpp_start_update_domains )
id_up_n = mpp_start_update_domains(data_n, domain, flags=update_flags)

..... ( doing some computation )

call mpp_complete_update_domains(id_up_1, data_1, domain, flags=update_flags)
.... ( other n-2 call mpp_complete_update_domains )
call mpp_complete_update_domains(id_up_n, data_n, domain, flags=update_flags)
Example 3: Replace group CGRID_NE vector, mpp_update_domains

Replace

call mpp_update_domains(u_1, v_1, domain, flags=update_flags, gridtype=CGRID_NE, complete=.false.)
.... ( other n-2 calls to mpp_update_domains with complete = .false. )
call mpp_update_domains(u_n, v_n, domain, flags=update_flags, gridtype=CGRID_NE, complete=.true. )

with

id_up_1 = mpp_start_update_domains(u_1, v_1, domain, flags=update_flags, gridtype=CGRID_NE)
.... ( other n-2 call mpp_start_update_domains )
id_up_n = mpp_start_update_domains(u_n, v_n, domain, flags=update_flags, gridtype=CGRID_NE)

..... ( doing some computation )

call mpp_complete_update_domains(id_up_1, u_1, v_1, domain, flags=update_flags, gridtype=CGRID_NE)
.... ( other n-2 call mpp_complete_update_domains )
call mpp_complete_update_domains(id_up_n, u_n, v_n, domain, flags=update_flags, gridtype=CGRID_NE)

For 2D domain updates, if there are halos present along both x and y, we can choose to update one only, by specifying flags=XUPDATE or flags=YUPDATE. In addition, one-sided updates can be performed by setting flags to any combination of WUPDATE, EUPDATE, SUPDATE and NUPDATE, to update the west, east, north and south halos respectively. Any combination of halos may be used by adding the requisite flags, e.g: flags=XUPDATE+SUPDATE or flags=EUPDATE+WUPDATE+SUPDATE will update the east, west and south halos.

If a call to mpp_start_update_domains/mpp_complete_update_domains involves at least one E-W halo and one N-S halo, the corners involved will also be updated, i.e, in the example above, the SE and SW corners will be updated.

If flags is not supplied, that is equivalent to flags=XUPDATE+YUPDATE.
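
For instance, a one-sided update of just the east, west, and south halos with the non-blocking pair looks like this (a minimal sketch; field and id_up are placeholder names):

       id_up = mpp_start_update_domains( field, domain, flags=EUPDATE+WUPDATE+SUPDATE )
       ! ... interior computation that does not touch these halos ...
       call mpp_complete_update_domains( id_up, field, domain, flags=EUPDATE+WUPDATE+SUPDATE )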

The vector version is passed the x and y components of a vector field in tandem, and both are updated upon return. They are passed together to treat parity issues on various grids. For example, on a cubic sphere projection, the x and y components may be interchanged when passing from an equatorial cube face to a polar face. For grids with folds, vector components change sign on crossing the fold. Paired scalar quantities can also be passed with the vector version if flags=SCALAR_PAIR, in which case components are appropriately interchanged, but signs are not.

Special treatment at boundaries such as folds is also required for staggered grids. The following types of staggered grids are recognized:
1) AGRID: values are at grid centers.
2) BGRID_NE: vector fields are at the NE vertex of a grid cell, i.e: the array elements u(i,j) and v(i,j) are actually at (i+½,j+½) with respect to the grid centers.
3) BGRID_SW: vector fields are at the SW vertex of a grid cell, i.e., the array elements u(i,j) and v(i,j) are actually at (i-½,j-½) with respect to the grid centers.
4) CGRID_NE: vector fields are at the N and E faces of a grid cell, i.e: the array elements u(i,j) and v(i,j) are actually at (i+½,j) and (i,j+½) with respect to the grid centers.
5) CGRID_SW: vector fields are at the S and W faces of a grid cell, i.e: the array elements u(i,j) and v(i,j) are actually at (i-½,j) and (i,j-½) with respect to the grid centers.

The gridtypes listed above are all available by use association as integer parameters. If vector fields are at staggered locations, the optional argument gridtype must be appropriately set for correct treatment at boundaries.
It is safe to apply vector field updates to the appropriate arrays irrespective of the domain topology: if the topology requires no special treatment of vector fields, specifying gridtype will do no harm.

mpp_start_update_domains/mpp_complete_update_domains internally buffers the data being sent and received into single messages for efficiency. A tunable internal buffer area in memory is provided for this purpose by mpp_domains_mod. The size of this buffer area can be set by the user by calling mpp_domains_set_stack_size.
Example usage:
            id_update = mpp_start_update_domains( field, domain, flags )
            call mpp_complete_update_domains( id_update, field, domain, flags )

Definition at line 1193 of file mpp_domains.F90.

Public Member Functions

 mpp_start_update_domain2d_i4_2d
 
 mpp_start_update_domain2d_i4_3d
 
 mpp_start_update_domain2d_i4_4d
 
 mpp_start_update_domain2d_i4_5d
 
 mpp_start_update_domain2d_i8_2d
 
 mpp_start_update_domain2d_i8_3d
 
 mpp_start_update_domain2d_i8_4d
 
 mpp_start_update_domain2d_i8_5d
 
 mpp_start_update_domain2d_r4_2d
 
 mpp_start_update_domain2d_r4_2dv
 
 mpp_start_update_domain2d_r4_3d
 
 mpp_start_update_domain2d_r4_3dv
 
 mpp_start_update_domain2d_r4_4d
 
 mpp_start_update_domain2d_r4_4dv
 
 mpp_start_update_domain2d_r4_5d
 
 mpp_start_update_domain2d_r4_5dv
 
 mpp_start_update_domain2d_r8_2d
 
 mpp_start_update_domain2d_r8_2dv
 
 mpp_start_update_domain2d_r8_3d
 
 mpp_start_update_domain2d_r8_3dv
 
 mpp_start_update_domain2d_r8_4d
 
 mpp_start_update_domain2d_r8_4dv
 
 mpp_start_update_domain2d_r8_5d
 
 mpp_start_update_domain2d_r8_5dv
 

◆ mpp_domains_mod::mpp_update_domains

interface mpp_domains_mod::mpp_update_domains

Performs halo updates for a given domain.

Used to perform a halo update of a domain-decomposed array on each PE. MPP_TYPE_ can be of type complex, integer, logical or real; of 4-byte or 8-byte kind; of rank up to 5. The vector version (with two input data fields) is only present for real types. For 2D domain updates, if there are halos present along both x and y, we can choose to update one only, by specifying flags=XUPDATE or flags=YUPDATE. In addition, one-sided updates can be performed by setting flags to any combination of WUPDATE, EUPDATE, SUPDATE and NUPDATE to update the west, east, north and south halos respectively. Any combination of halos may be used by adding the requisite flags, e.g.: flags=XUPDATE+SUPDATE or flags=EUPDATE+WUPDATE+SUPDATE will update the east, west and south halos.

If a call to mpp_update_domains involves at least one E-W halo and one N-S halo, the corners involved will also be updated, i.e, in the example above, the SE and SW corners will be updated.
If flags is not supplied, that is equivalent to flags=XUPDATE+YUPDATE.

The vector version is passed the x and y components of a vector field in tandem, and both are updated upon return. They are passed together to treat parity issues on various grids. For example, on a cubic sphere projection, the x and y components may be interchanged when passing from an equatorial cube face to a polar face. For grids with folds, vector components change sign on crossing the fold. Paired scalar quantities can also be passed with the vector version if flags=SCALAR_PAIR, in which case components are appropriately interchanged, but signs are not.

Special treatment at boundaries such as folds is also required for staggered grids. The following types of staggered grids are recognized:

1) AGRID: values are at grid centers.
2) BGRID_NE: vector fields are at the NE vertex of a grid cell, i.e: the array elements u(i,j) and v(i,j) are actually at (i+½,j+½) with respect to the grid centers.
3) BGRID_SW: vector fields are at the SW vertex of a grid cell, i.e: the array elements u(i,j) and v(i,j) are actually at (i-½,j-½) with respect to the grid centers.
4) CGRID_NE: vector fields are at the N and E faces of a grid cell, i.e: the array elements u(i,j) and v(i,j) are actually at (i+½,j) and (i,j+½) with respect to the grid centers.
5) CGRID_SW: vector fields are at the S and W faces of a grid cell, i.e: the array elements u(i,j) and v(i,j) are actually at (i-½,j) and (i,j-½) with respect to the grid centers.

The gridtypes listed above are all available by use association as integer parameters. The scalar version of mpp_update_domains assumes that the values of a scalar field are always at AGRID locations, and no special boundary treatment is required. If vector fields are at staggered locations, the optional argument gridtype must be appropriately set for correct treatment at boundaries.
It is safe to apply vector field updates to the appropriate arrays irrespective of the domain topology: if the topology requires no special treatment of vector fields, specifying gridtype will do no harm.

mpp_update_domains internally buffers the data being sent and received into single messages for efficiency. A tunable internal buffer area in memory is provided for this purpose by mpp_domains_mod. The size of this buffer area can be set by the user by calling mpp_domains_set_stack_size.

Example usage:

             call mpp_update_domains( field, domain, flags )                      ! update a scalar field
             call mpp_update_domains( fieldx, fieldy, domain, flags, gridtype )   ! update a vector field

Definition at line 1012 of file mpp_domains.F90.
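A more concrete sketch (placeholder field names; the domain is assumed to have been created with halos by mpp_define_domains):

       ! update only the east and west halos of a scalar field
       call mpp_update_domains( tracer, domain, flags=XUPDATE )

       ! update a C-grid vector field; passing u and v together lets the library
       ! handle component interchange and sign changes at folds and cube edges
       call mpp_update_domains( u, v, domain, gridtype=CGRID_NE )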

Public Member Functions

 mpp_update_domain2d_i4_2d
 
 mpp_update_domain2d_i4_3d
 
 mpp_update_domain2d_i4_4d
 
 mpp_update_domain2d_i4_5d
 
 mpp_update_domain2d_i8_2d
 
 mpp_update_domain2d_i8_3d
 
 mpp_update_domain2d_i8_4d
 
 mpp_update_domain2d_i8_5d
 
 mpp_update_domain2d_r4_2d
 
 mpp_update_domain2d_r4_2dv
 
 mpp_update_domain2d_r4_3d
 
 mpp_update_domain2d_r4_3dv
 
 mpp_update_domain2d_r4_4d
 
 mpp_update_domain2d_r4_4dv
 
 mpp_update_domain2d_r4_5d
 
 mpp_update_domain2d_r4_5dv
 
 mpp_update_domain2d_r8_2d
 
 mpp_update_domain2d_r8_2dv
 
 mpp_update_domain2d_r8_3d
 
 mpp_update_domain2d_r8_3dv
 
 mpp_update_domain2d_r8_4d
 
 mpp_update_domain2d_r8_4dv
 
 mpp_update_domain2d_r8_5d
 
 mpp_update_domain2d_r8_5dv
 

◆ mpp_domains_mod::mpp_update_domains_ad

interface mpp_domains_mod::mpp_update_domains_ad

Similar to mpp_update_domains, but updates adjoint domains.

Definition at line 1518 of file mpp_domains.F90.

Public Member Functions

 mpp_update_domains_ad_2d_r4_2d
 
 mpp_update_domains_ad_2d_r4_2dv
 
 mpp_update_domains_ad_2d_r4_3d
 
 mpp_update_domains_ad_2d_r4_3dv
 
 mpp_update_domains_ad_2d_r4_4d
 
 mpp_update_domains_ad_2d_r4_4dv
 
 mpp_update_domains_ad_2d_r4_5d
 
 mpp_update_domains_ad_2d_r4_5dv
 
 mpp_update_domains_ad_2d_r8_2d
 
 mpp_update_domains_ad_2d_r8_2dv
 
 mpp_update_domains_ad_2d_r8_3d
 
 mpp_update_domains_ad_2d_r8_3dv
 
 mpp_update_domains_ad_2d_r8_4d
 
 mpp_update_domains_ad_2d_r8_4dv
 
 mpp_update_domains_ad_2d_r8_5d
 
 mpp_update_domains_ad_2d_r8_5dv
 

◆ mpp_domains_mod::mpp_update_nest_coarse

interface mpp_domains_mod::mpp_update_nest_coarse

Passes data from the fine grid to fill a buffer, ready to be interpolated onto the coarse grid.
Example usage:

          call mpp_update_nest_coarse(field, nest_domain, field_out, nest_level, complete,
                            position, name, tile_count)

Definition at line 1437 of file mpp_domains.F90.

Public Member Functions

 mpp_update_nest_coarse_i4_2d
 
 mpp_update_nest_coarse_i4_3d
 
 mpp_update_nest_coarse_i4_4d
 
 mpp_update_nest_coarse_i8_2d
 
 mpp_update_nest_coarse_i8_3d
 
 mpp_update_nest_coarse_i8_4d
 
 mpp_update_nest_coarse_r4_2d
 
 mpp_update_nest_coarse_r4_2dv
 
 mpp_update_nest_coarse_r4_3d
 
 mpp_update_nest_coarse_r4_3dv
 
 mpp_update_nest_coarse_r4_4d
 
 mpp_update_nest_coarse_r4_4dv
 
 mpp_update_nest_coarse_r8_2d
 
 mpp_update_nest_coarse_r8_2dv
 
 mpp_update_nest_coarse_r8_3d
 
 mpp_update_nest_coarse_r8_3dv
 
 mpp_update_nest_coarse_r8_4d
 
 mpp_update_nest_coarse_r8_4dv
 

◆ mpp_domains_mod::mpp_update_nest_fine

interface mpp_domains_mod::mpp_update_nest_fine

Passes data from the coarse grid to fill a buffer, ready to be interpolated onto the fine grid.
Example usage:

           call mpp_update_nest_fine(field, nest_domain, wbuffer, ebuffer, sbuffer,
                       nbuffer, nest_level, flags, complete, position, extra_halo, name,
                       tile_count)

Definition at line 1383 of file mpp_domains.F90.

Public Member Functions

 mpp_update_nest_fine_i4_2d
 
 mpp_update_nest_fine_i4_3d
 
 mpp_update_nest_fine_i4_4d
 
 mpp_update_nest_fine_i8_2d
 
 mpp_update_nest_fine_i8_3d
 
 mpp_update_nest_fine_i8_4d
 
 mpp_update_nest_fine_r4_2d
 
 mpp_update_nest_fine_r4_2dv
 
 mpp_update_nest_fine_r4_3d
 
 mpp_update_nest_fine_r4_3dv
 
 mpp_update_nest_fine_r4_4d
 
 mpp_update_nest_fine_r4_4dv
 
 mpp_update_nest_fine_r8_2d
 
 mpp_update_nest_fine_r8_2dv
 
 mpp_update_nest_fine_r8_3d
 
 mpp_update_nest_fine_r8_3dv
 
 mpp_update_nest_fine_r8_4d
 
 mpp_update_nest_fine_r8_4dv
 

◆ mpp_domains_mod::nest_domain_type

type mpp_domains_mod::nest_domain_type

Domain with nested fine and coarse tiles.

Definition at line 455 of file mpp_domains.F90.

Collaboration diagram for nest_domain_type:

Public Attributes

integer, dimension(:), pointer iend_coarse
 
integer, dimension(:), pointer iend_fine
 
integer, dimension(:), pointer istart_coarse
 
integer, dimension(:), pointer istart_fine
 
integer, dimension(:), pointer jend_coarse
 
integer, dimension(:), pointer jend_fine
 
integer, dimension(:), pointer jstart_coarse
 
integer, dimension(:), pointer jstart_fine
 
character(len=name_length) name
 
type(nest_level_type), dimension(:), pointer nest => NULL()
 
integer, dimension(:), pointer nest_level => NULL()
 Added for moving nest functionality.
 
integer num_level
 
integer num_nest
 
integer, dimension(:), pointer tile_coarse
 
integer, dimension(:), pointer tile_fine
 

Member Data Documentation

◆ iend_coarse

integer, dimension(:), pointer iend_coarse

Definition at line 463 of file mpp_domains.F90.

◆ iend_fine

integer, dimension(:), pointer iend_fine

Definition at line 462 of file mpp_domains.F90.

◆ istart_coarse

integer, dimension(:), pointer istart_coarse

Definition at line 463 of file mpp_domains.F90.

◆ istart_fine

integer, dimension(:), pointer istart_fine

Definition at line 462 of file mpp_domains.F90.

◆ jend_coarse

integer, dimension(:), pointer jend_coarse

Definition at line 463 of file mpp_domains.F90.

◆ jend_fine

integer, dimension(:), pointer jend_fine

Definition at line 462 of file mpp_domains.F90.

◆ jstart_coarse

integer, dimension(:), pointer jstart_coarse

Definition at line 463 of file mpp_domains.F90.

◆ jstart_fine

integer, dimension(:), pointer jstart_fine

Definition at line 462 of file mpp_domains.F90.

◆ name

character(len=name_length) name

Definition at line 456 of file mpp_domains.F90.

◆ nest

type(nest_level_type), dimension(:), pointer nest => NULL()

Definition at line 459 of file mpp_domains.F90.

◆ nest_level

integer, dimension(:), pointer nest_level => NULL()

Added for moving nest functionality.

Definition at line 458 of file mpp_domains.F90.

◆ num_level

integer num_level

Definition at line 457 of file mpp_domains.F90.

◆ num_nest

integer num_nest

Definition at line 460 of file mpp_domains.F90.

◆ tile_coarse

integer, dimension(:), pointer tile_coarse

Definition at line 461 of file mpp_domains.F90.

◆ tile_fine

integer, dimension(:), pointer tile_fine

Definition at line 461 of file mpp_domains.F90.

◆ mpp_domains_mod::nest_level_type

type mpp_domains_mod::nest_level_type

Private type to hold data for each level of nesting.

Definition at line 468 of file mpp_domains.F90.

Collaboration diagram for nest_level_type:

Public Attributes

type(nestspec), pointer c2f_c => NULL()
 
type(nestspec), pointer c2f_e => NULL()
 
type(nestspec), pointer c2f_n => NULL()
 
type(nestspec), pointer c2f_t => NULL()
 
type(domain2d), pointer domain_coarse => NULL()
 
type(domain2d), pointer domain_fine => NULL()
 
type(nestspec), pointer f2c_c => NULL()
 
type(nestspec), pointer f2c_e => NULL()
 
type(nestspec), pointer f2c_n => NULL()
 
type(nestspec), pointer f2c_t => NULL()
 
integer, dimension(:), pointer iend_coarse
 
integer, dimension(:), pointer iend_fine
 
logical is_coarse
 
logical is_coarse_pe
 
logical is_fine
 
logical is_fine_pe
 
integer, dimension(:), pointer istart_coarse
 
integer, dimension(:), pointer istart_fine
 
integer, dimension(:), pointer jend_coarse
 
integer, dimension(:), pointer jend_fine
 
integer, dimension(:), pointer jstart_coarse
 
integer, dimension(:), pointer jstart_fine
 
integer, dimension(:), pointer my_nest_id
 
integer my_num_nest
 
integer num_nest
 
integer, dimension(:), pointer pelist => NULL()
 
integer, dimension(:), pointer pelist_coarse => NULL()
 
integer, dimension(:), pointer pelist_fine => NULL()
 
integer, dimension(:), pointer tile_coarse
 
integer, dimension(:), pointer tile_fine
 
integer x_refine
 
integer y_refine
 

Private Attributes

logical on_level
 

Member Data Documentation

◆ c2f_c

type(nestspec), pointer c2f_c => NULL()

Definition at line 484 of file mpp_domains.F90.

◆ c2f_e

type(nestspec), pointer c2f_e => NULL()

Definition at line 485 of file mpp_domains.F90.

◆ c2f_n

type(nestspec), pointer c2f_n => NULL()

Definition at line 486 of file mpp_domains.F90.

◆ c2f_t

type(nestspec), pointer c2f_t => NULL()

Definition at line 483 of file mpp_domains.F90.

◆ domain_coarse

type(domain2d), pointer domain_coarse => NULL()

Definition at line 492 of file mpp_domains.F90.

◆ domain_fine

type(domain2d), pointer domain_fine => NULL()

Definition at line 491 of file mpp_domains.F90.

◆ f2c_c

type(nestspec), pointer f2c_c => NULL()

Definition at line 488 of file mpp_domains.F90.

◆ f2c_e

type(nestspec), pointer f2c_e => NULL()

Definition at line 489 of file mpp_domains.F90.

◆ f2c_n

type(nestspec), pointer f2c_n => NULL()

Definition at line 490 of file mpp_domains.F90.

◆ f2c_t

type(nestspec), pointer f2c_t => NULL()

Definition at line 487 of file mpp_domains.F90.

◆ iend_coarse

integer, dimension(:), pointer iend_coarse

Definition at line 477 of file mpp_domains.F90.

◆ iend_fine

integer, dimension(:), pointer iend_fine

Definition at line 476 of file mpp_domains.F90.

◆ is_coarse

logical is_coarse

Definition at line 471 of file mpp_domains.F90.

◆ is_coarse_pe

logical is_coarse_pe

Definition at line 479 of file mpp_domains.F90.

◆ is_fine

logical is_fine

Definition at line 471 of file mpp_domains.F90.

◆ is_fine_pe

logical is_fine_pe

Definition at line 479 of file mpp_domains.F90.

◆ istart_coarse

integer, dimension(:), pointer istart_coarse

Definition at line 477 of file mpp_domains.F90.

◆ istart_fine

integer, dimension(:), pointer istart_fine

Definition at line 476 of file mpp_domains.F90.

◆ jend_coarse

integer, dimension(:), pointer jend_coarse

Definition at line 477 of file mpp_domains.F90.

◆ jend_fine

integer, dimension(:), pointer jend_fine

Definition at line 476 of file mpp_domains.F90.

◆ jstart_coarse

integer, dimension(:), pointer jstart_coarse

Definition at line 477 of file mpp_domains.F90.

◆ jstart_fine

integer, dimension(:), pointer jstart_fine

Definition at line 476 of file mpp_domains.F90.

◆ my_nest_id

integer, dimension(:), pointer my_nest_id

Definition at line 474 of file mpp_domains.F90.

◆ my_num_nest

integer my_num_nest

Definition at line 473 of file mpp_domains.F90.

◆ num_nest

integer num_nest

Definition at line 472 of file mpp_domains.F90.

◆ on_level

logical on_level
private

Definition at line 470 of file mpp_domains.F90.

◆ pelist

integer, dimension(:), pointer pelist => NULL()

Definition at line 480 of file mpp_domains.F90.

◆ pelist_coarse

integer, dimension(:), pointer pelist_coarse => NULL()

Definition at line 482 of file mpp_domains.F90.

◆ pelist_fine

integer, dimension(:), pointer pelist_fine => NULL()

Definition at line 481 of file mpp_domains.F90.

◆ tile_coarse

integer, dimension(:), pointer tile_coarse

Definition at line 475 of file mpp_domains.F90.

◆ tile_fine

integer, dimension(:), pointer tile_fine

Definition at line 475 of file mpp_domains.F90.

◆ x_refine

integer x_refine

Definition at line 478 of file mpp_domains.F90.

◆ y_refine

integer y_refine

Definition at line 478 of file mpp_domains.F90.

◆ mpp_domains_mod::nestspec

type mpp_domains_mod::nestspec

Used to specify bounds and index information for nested tiles as a linked list.

Definition at line 438 of file mpp_domains.F90.

Collaboration diagram for nestspec:

Public Attributes

type(index_type) center
 
type(index_type) east
 
integer extra_halo
 
type(nestspec), pointer next => NULL()
 
type(index_type) north
 
integer nrecv
 
integer nsend
 
type(overlap_type), dimension(:), pointer recv => NULL()
 
type(overlap_type), dimension(:), pointer send => NULL()
 
type(index_type) south
 
type(index_type) west
 
integer xbegin_c
 
integer xbegin_f
 
integer xend
 
integer xend_c
 
integer xend_f
 
integer xsize_c
 
integer ybegin
 
integer ybegin_c
 
integer ybegin_f
 
integer yend
 
integer yend_c
 
integer yend_f
 
integer ysize_c
 

Private Attributes

integer xbegin
 

Member Data Documentation

◆ center

type(index_type) center

Definition at line 444 of file mpp_domains.F90.

◆ east

type(index_type) east

Definition at line 444 of file mpp_domains.F90.

◆ extra_halo

integer extra_halo

Definition at line 446 of file mpp_domains.F90.

◆ next

type(nestspec), pointer next => NULL()

Definition at line 449 of file mpp_domains.F90.

◆ north

type(index_type) north

Definition at line 444 of file mpp_domains.F90.

◆ nrecv

integer nrecv

Definition at line 445 of file mpp_domains.F90.

◆ nsend

integer nsend

Definition at line 445 of file mpp_domains.F90.

◆ recv

type(overlap_type), dimension(:), pointer recv => NULL()

Definition at line 448 of file mpp_domains.F90.

◆ send

type(overlap_type), dimension(:), pointer send => NULL()

Definition at line 447 of file mpp_domains.F90.

◆ south

type(index_type) south

Definition at line 444 of file mpp_domains.F90.

◆ west

type(index_type) west

Definition at line 444 of file mpp_domains.F90.

◆ xbegin

integer xbegin
private

Definition at line 440 of file mpp_domains.F90.

◆ xbegin_c

integer xbegin_c

Definition at line 441 of file mpp_domains.F90.

◆ xbegin_f

integer xbegin_f

Definition at line 442 of file mpp_domains.F90.

◆ xend

integer xend

Definition at line 440 of file mpp_domains.F90.

◆ xend_c

integer xend_c

Definition at line 441 of file mpp_domains.F90.

◆ xend_f

integer xend_f

Definition at line 442 of file mpp_domains.F90.

◆ xsize_c

integer xsize_c

Definition at line 443 of file mpp_domains.F90.

◆ ybegin

integer ybegin

Definition at line 440 of file mpp_domains.F90.

◆ ybegin_c

integer ybegin_c

Definition at line 441 of file mpp_domains.F90.

◆ ybegin_f

integer ybegin_f

Definition at line 442 of file mpp_domains.F90.

◆ yend

integer yend

Definition at line 440 of file mpp_domains.F90.

◆ yend_c

integer yend_c

Definition at line 441 of file mpp_domains.F90.

◆ yend_f

integer yend_f

Definition at line 442 of file mpp_domains.F90.

◆ ysize_c

integer ysize_c

Definition at line 443 of file mpp_domains.F90.

◆ mpp_domains_mod::nonblock_type

type mpp_domains_mod::nonblock_type

Used for nonblocking data transfer.

Definition at line 548 of file mpp_domains.F90.

Collaboration diagram for nonblock_type:

Public Attributes

integer, dimension(max_request) buffer_pos_recv
 
integer, dimension(max_request) buffer_pos_send
 
integer(i8_kind), dimension(max_domain_fields) field_addrs
 
integer(i8_kind), dimension(max_domain_fields) field_addrs2
 
integer nfields
 
integer recv_msgsize
 
integer recv_pos
 
integer, dimension(max_request) request_recv
 
integer request_recv_count
 
integer, dimension(max_request) request_send
 
integer request_send_count
 
integer send_msgsize
 
integer send_pos
 
integer, dimension(max_request) size_recv
 
integer, dimension(max_request) type_recv
 
integer update_ehalo
 
integer update_flags
 
integer update_gridtype
 
integer update_nhalo
 
integer update_position
 
integer update_shalo
 
integer update_whalo
 

Member Data Documentation

◆ buffer_pos_recv

integer, dimension(max_request) buffer_pos_recv

Definition at line 567 of file mpp_domains.F90.

◆ buffer_pos_send

integer, dimension(max_request) buffer_pos_send

Definition at line 566 of file mpp_domains.F90.

◆ field_addrs

integer(i8_kind), dimension(max_domain_fields) field_addrs

Definition at line 568 of file mpp_domains.F90.

◆ field_addrs2

integer(i8_kind), dimension(max_domain_fields) field_addrs2

Definition at line 569 of file mpp_domains.F90.

◆ nfields

integer nfields

Definition at line 570 of file mpp_domains.F90.

◆ recv_msgsize

integer recv_msgsize

Definition at line 551 of file mpp_domains.F90.

◆ recv_pos

integer recv_pos

Definition at line 549 of file mpp_domains.F90.

◆ request_recv

integer, dimension(max_request) request_recv

Definition at line 563 of file mpp_domains.F90.

◆ request_recv_count

integer request_recv_count

Definition at line 561 of file mpp_domains.F90.

◆ request_send

integer, dimension(max_request) request_send

Definition at line 562 of file mpp_domains.F90.

◆ request_send_count

integer request_send_count

Definition at line 560 of file mpp_domains.F90.

◆ send_msgsize

integer send_msgsize

Definition at line 552 of file mpp_domains.F90.

◆ send_pos

integer send_pos

Definition at line 550 of file mpp_domains.F90.

◆ size_recv

integer, dimension(max_request) size_recv

Definition at line 564 of file mpp_domains.F90.

◆ type_recv

integer, dimension(max_request) type_recv

Definition at line 565 of file mpp_domains.F90.

◆ update_ehalo

integer update_ehalo

Definition at line 557 of file mpp_domains.F90.

◆ update_flags

integer update_flags

Definition at line 553 of file mpp_domains.F90.

◆ update_gridtype

integer update_gridtype

Definition at line 555 of file mpp_domains.F90.

◆ update_nhalo

integer update_nhalo

Definition at line 559 of file mpp_domains.F90.

◆ update_position

integer update_position

Definition at line 554 of file mpp_domains.F90.

◆ update_shalo

integer update_shalo

Definition at line 558 of file mpp_domains.F90.

◆ update_whalo

integer update_whalo

Definition at line 556 of file mpp_domains.F90.

◆ mpp_domains_mod::operator(.eq.)

interface mpp_domains_mod::operator(.eq.)

Equality/inequality operators for domain types.


The module provides public operators to check for equality/inequality of domain types, e.g.:

       type(domain1D) :: a, b
       type(domain2D) :: c, d
       ...
       if( a.NE.b )then
       ...
       end if
       if( c==d )then
       ...
       end if


Domains are considered equal if and only if the start and end indices of each of their component global, data and compute domains are equal.

Definition at line 2170 of file mpp_domains.F90.

Public Member Functions

 mpp_domain1d_eq
 
 mpp_domain2d_eq
 
 mpp_domainug_eq
 

◆ mpp_domains_mod::operator(.ne.)

interface mpp_domains_mod::operator(.ne.)

Definition at line 2177 of file mpp_domains.F90.

Public Member Functions

 mpp_domain1d_ne
 
 mpp_domain2d_ne
 
 mpp_domainug_ne
 

◆ mpp_domains_mod::overlap_type

type mpp_domains_mod::overlap_type

Type for overlapping data.

Definition at line 319 of file mpp_domains.F90.

Collaboration diagram for overlap_type:

Public Attributes

integer, dimension(:), pointer dir => NULL()
 direction ( value 1,2,3,4 = E,S,W,N)
 
logical, dimension(:), pointer from_contact => NULL()
 indicate if the overlap is computed from define_contact_overlap
 
integer, dimension(:), pointer ie => NULL()
 ending i-index
 
integer, dimension(:), pointer index => NULL()
 for refinement
 
integer, dimension(:), pointer is => NULL()
 starting i-index
 
integer, dimension(:), pointer je => NULL()
 ending j-index
 
integer, dimension(:), pointer js => NULL()
 starting j-index
 
integer, dimension(:), pointer msgsize => NULL()
 overlapping msgsize to be sent or received
 
integer pe
 
integer, dimension(:), pointer rotation => NULL()
 rotation angle.
 
integer start_pos
 start position in the buffer
 
integer, dimension(:), pointer tileme => NULL()
 my tile id for this overlap
 
integer, dimension(:), pointer tilenbr => NULL()
 neighbor tile id for this overlap
 
integer totsize
total message size
 

Private Attributes

integer count = 0
number of overlapping regions
 

Member Data Documentation

◆ count

integer count = 0
private

number of overlapping regions

Definition at line 321 of file mpp_domains.F90.

◆ dir

integer, dimension(:), pointer dir => NULL()

direction ( value 1,2,3,4 = E,S,W,N)

Definition at line 332 of file mpp_domains.F90.

◆ from_contact

logical, dimension(:), pointer from_contact => NULL()

indicate if the overlap is computed from define_contact_overlap

Definition at line 335 of file mpp_domains.F90.

◆ ie

integer, dimension(:), pointer ie => NULL()

ending i-index

Definition at line 329 of file mpp_domains.F90.

◆ index

integer, dimension(:), pointer index => NULL()

for refinement

Definition at line 334 of file mpp_domains.F90.

◆ is

integer, dimension(:), pointer is => NULL()

starting i-index

Definition at line 328 of file mpp_domains.F90.

◆ je

integer, dimension(:), pointer je => NULL()

ending j-index

Definition at line 331 of file mpp_domains.F90.

◆ js

integer, dimension(:), pointer js => NULL()

starting j-index

Definition at line 330 of file mpp_domains.F90.

◆ msgsize

integer, dimension(:), pointer msgsize => NULL()

overlapping msgsize to be sent or received

Definition at line 325 of file mpp_domains.F90.

◆ pe

integer pe

Definition at line 322 of file mpp_domains.F90.

◆ rotation

integer, dimension(:), pointer rotation => NULL()

rotation angle.

Definition at line 333 of file mpp_domains.F90.

◆ start_pos

integer start_pos

start position in the buffer

Definition at line 323 of file mpp_domains.F90.

◆ tileme

integer, dimension(:), pointer tileme => NULL()

my tile id for this overlap

Definition at line 326 of file mpp_domains.F90.

◆ tilenbr

integer, dimension(:), pointer tilenbr => NULL()

neighbor tile id for this overlap

Definition at line 327 of file mpp_domains.F90.

◆ totsize

integer totsize

total message size

Definition at line 324 of file mpp_domains.F90.

◆ mpp_domains_mod::overlapspec

type mpp_domains_mod::overlapspec

Private type for overlap specifications.

Definition at line 341 of file mpp_domains.F90.

Collaboration diagram for overlapspec:

Public Attributes

integer ehalo
 
type(overlapspec), pointer next => NULL()
 
integer nhalo
 halo size
 
integer nrecv
 
integer nsend
 
type(overlap_type), dimension(:), pointer recv => NULL()
 
integer recvsize
 
type(overlap_type), dimension(:), pointer send => NULL()
 
integer sendsize
 
integer shalo
 
integer xbegin
 
integer xend
 
integer ybegin
 
integer yend
 

Private Attributes

integer whalo
 

Member Data Documentation

◆ ehalo

integer ehalo

Definition at line 343 of file mpp_domains.F90.

◆ next

type(overlapspec), pointer next => NULL()

Definition at line 349 of file mpp_domains.F90.

◆ nhalo

integer nhalo

halo size

Definition at line 343 of file mpp_domains.F90.

◆ nrecv

integer nrecv

Definition at line 345 of file mpp_domains.F90.

◆ nsend

integer nsend

Definition at line 345 of file mpp_domains.F90.

◆ recv

type(overlap_type), dimension(:), pointer recv => NULL()

Definition at line 348 of file mpp_domains.F90.

◆ recvsize

integer recvsize

Definition at line 346 of file mpp_domains.F90.

◆ send

type(overlap_type), dimension(:), pointer send => NULL()

Definition at line 347 of file mpp_domains.F90.

◆ sendsize

integer sendsize

Definition at line 346 of file mpp_domains.F90.

◆ shalo

integer shalo

Definition at line 343 of file mpp_domains.F90.

◆ whalo

integer whalo
private

Definition at line 343 of file mpp_domains.F90.

◆ xbegin

integer xbegin

Definition at line 344 of file mpp_domains.F90.

◆ xend

integer xend

Definition at line 344 of file mpp_domains.F90.

◆ ybegin

integer ybegin

Definition at line 344 of file mpp_domains.F90.

◆ yend

integer yend

Definition at line 344 of file mpp_domains.F90.

◆ mpp_domains_mod::tile_type

type mpp_domains_mod::tile_type

Upper and lower x and y bounds for a tile.

Definition at line 354 of file mpp_domains.F90.

Collaboration diagram for tile_type:

Public Attributes

integer xbegin
 
integer xend
 
integer ybegin
 
integer yend
 

Member Data Documentation

◆ xbegin

integer xbegin

Definition at line 355 of file mpp_domains.F90.

◆ xend

integer xend

Definition at line 355 of file mpp_domains.F90.

◆ ybegin

integer ybegin

Definition at line 355 of file mpp_domains.F90.

◆ yend

integer yend

Definition at line 355 of file mpp_domains.F90.

◆ mpp_domains_mod::unstruct_axis_spec

type mpp_domains_mod::unstruct_axis_spec

Private type for axis specification data for an unstructured grid.

Definition at line 229 of file mpp_domains.F90.

Collaboration diagram for unstruct_axis_spec:

Public Attributes

integer begin_index
 
integer end
 
integer end_index
 
integer max_size
 
integer size
 

Private Attributes

integer begin
 

Member Data Documentation

◆ begin

integer begin
private

Definition at line 231 of file mpp_domains.F90.

◆ begin_index

integer begin_index

Definition at line 232 of file mpp_domains.F90.

◆ end

integer end

Definition at line 231 of file mpp_domains.F90.

◆ end_index

integer end_index

Definition at line 232 of file mpp_domains.F90.

◆ max_size

integer max_size

Definition at line 231 of file mpp_domains.F90.

◆ size

integer size

Definition at line 231 of file mpp_domains.F90.

◆ mpp_domains_mod::unstruct_domain_spec

type mpp_domains_mod::unstruct_domain_spec

Private type for axis specification data for an unstructured domain.

Definition at line 237 of file mpp_domains.F90.

Collaboration diagram for unstruct_domain_spec:

Public Attributes

integer pe
 
integer pos
 
integer tile_id
 

Private Attributes

type(unstruct_axis_spec) compute
 

Member Data Documentation

◆ compute

type(unstruct_axis_spec) compute
private

Definition at line 239 of file mpp_domains.F90.

◆ pe

integer pe

Definition at line 240 of file mpp_domains.F90.

◆ pos

integer pos

Definition at line 241 of file mpp_domains.F90.

◆ tile_id

integer tile_id

Definition at line 242 of file mpp_domains.F90.

◆ mpp_domains_mod::unstruct_overlap_type

type mpp_domains_mod::unstruct_overlap_type

Private type.

Definition at line 247 of file mpp_domains.F90.

Collaboration diagram for unstruct_overlap_type:

Public Attributes

integer, dimension(:), pointer i =>NULL()
 
integer, dimension(:), pointer j =>NULL()
 
integer pe
 

Private Attributes

integer count = 0
 

Member Data Documentation

◆ count

integer count = 0
private

Definition at line 249 of file mpp_domains.F90.

◆ i

integer, dimension(:), pointer i =>NULL()

Definition at line 251 of file mpp_domains.F90.

◆ j

integer, dimension(:), pointer j =>NULL()

Definition at line 252 of file mpp_domains.F90.

◆ pe

integer pe

Definition at line 250 of file mpp_domains.F90.

◆ mpp_domains_mod::unstruct_pass_type

type mpp_domains_mod::unstruct_pass_type

Private type.

Definition at line 257 of file mpp_domains.F90.

Collaboration diagram for unstruct_pass_type:

Public Attributes

integer nrecv
 
type(unstruct_overlap_type), dimension(:), pointer recv =>NULL()
 
type(unstruct_overlap_type), dimension(:), pointer send =>NULL()
 

Private Attributes

integer nsend
 

Member Data Documentation

◆ nrecv

integer nrecv

Definition at line 259 of file mpp_domains.F90.

◆ nsend

integer nsend
private

Definition at line 259 of file mpp_domains.F90.

◆ recv

type(unstruct_overlap_type), dimension(:), pointer recv =>NULL()

Definition at line 260 of file mpp_domains.F90.

◆ send

type(unstruct_overlap_type), dimension(:), pointer send =>NULL()

Definition at line 261 of file mpp_domains.F90.

Function/Subroutine Documentation

◆ add_check_overlap()

subroutine add_check_overlap ( type(overlap_type), intent(inout)  overlap_out,
type(overlap_type), intent(in)  overlap_in 
)

this routine adds the overlap_in into overlap_out

Definition at line 7818 of file mpp_domains_define.inc.

◆ add_update_overlap()

subroutine add_update_overlap ( type(overlap_type), intent(inout)  overlap_out,
type(overlap_type), intent(in)  overlap_in 
)

Definition at line 8037 of file mpp_domains_define.inc.

◆ allocate_check_overlap()

subroutine allocate_check_overlap ( type(overlap_type), intent(inout)  overlap,
integer, intent(in)  count 
)

Definition at line 7768 of file mpp_domains_define.inc.

◆ allocate_nest_overlap()

subroutine allocate_nest_overlap ( type(overlap_type), intent(inout)  overlap,
integer, intent(in)  count 
)

Definition at line 1443 of file mpp_define_nest_domains.inc.

◆ allocate_update_overlap()

subroutine allocate_update_overlap ( type(overlap_type), intent(inout)  overlap,
integer, intent(in)  count 
)

Definition at line 7880 of file mpp_domains_define.inc.

◆ apply_cyclic_offset()

subroutine apply_cyclic_offset ( integer, intent(inout)  lstart,
integer, intent(inout)  lend,
integer, intent(in)  offset,
integer, intent(in)  gstart,
integer, intent(in)  gend,
integer, intent(in)  gsize 
)

add offset to the index

Definition at line 4973 of file mpp_domains_define.inc.

◆ check_alignment()

subroutine check_alignment ( integer, intent(inout)  is,
integer, intent(inout)  ie,
integer, intent(inout)  js,
integer, intent(inout)  je,
integer, intent(inout)  isg,
integer, intent(inout)  ieg,
integer, intent(inout)  jsg,
integer, intent(inout)  jeg,
integer, intent(out)  alignment 
)

Definition at line 7503 of file mpp_domains_define.inc.

◆ check_data_size_1d()

subroutine check_data_size_1d ( character(len=*), intent(in)  module,
character(len=*), intent(in)  str1,
integer, intent(in)  size1,
character(len=*), intent(in)  str2,
integer, intent(in)  size2 
)

Definition at line 2444 of file mpp_define_nest_domains.inc.

◆ check_data_size_2d()

subroutine check_data_size_2d ( character(len=*), intent(in)  module,
character(len=*), intent(in)  str1,
integer, intent(in)  isize1,
integer, intent(in)  jsize1,
character(len=*), intent(in)  str2,
integer, intent(in)  isize2,
integer, intent(in)  jsize2 
)

Definition at line 2457 of file mpp_define_nest_domains.inc.

◆ check_message_size()

subroutine check_message_size ( type(domain2d), intent(in)  domain,
type(overlapspec), intent(in)  update,
logical, dimension(:), intent(in)  send,
logical, dimension(:), intent(in)  recv,
character, intent(in)  position 
)

Definition at line 1127 of file mpp_domains_define.inc.

◆ check_overlap_pe_order()

subroutine check_overlap_pe_order ( type(domain2d), intent(in)  domain,
type(overlapspec), intent(in)  overlap,
character(len=*), intent(in)  name 
)

Definition at line 8153 of file mpp_domains_define.inc.

◆ compute_overlap_coarse_to_fine()

subroutine compute_overlap_coarse_to_fine ( type(nest_level_type), intent(inout)  nest_domain,
type(nestspec), intent(inout)  overlap,
integer, intent(in)  extra_halo,
integer, intent(in)  position,
character(len=*), intent(in)  name 
)

Definition at line 604 of file mpp_define_nest_domains.inc.

◆ compute_overlap_fine_to_coarse()

subroutine compute_overlap_fine_to_coarse ( type(nest_level_type), intent(inout)  nest_domain,
type(nestspec), intent(inout)  overlap,
integer, intent(in)  position,
character(len=*), intent(in)  name 
)

This routine computes the send and recv information between the overlapped nesting regions. The data is assumed to be on the T-cell center.

Definition at line 1110 of file mpp_define_nest_domains.inc.

◆ compute_overlaps()

subroutine compute_overlaps ( type(domain2d), intent(inout)  domain,
integer, intent(in)  position,
type(overlapspec), intent(inout), pointer  update,
type(overlapspec), intent(inout), pointer  check,
integer, intent(in)  ishift,
integer, intent(in)  jshift,
integer, intent(in)  x_cyclic_offset,
integer, intent(in)  y_cyclic_offset,
integer, intent(in)  whalo,
integer, intent(in)  ehalo,
integer, intent(in)  shalo,
integer, intent(in)  nhalo 
)

Computes remote domain overlaps.

Assumes only one overlap in each direction and calculates the overlap for T-, E-, C-, and N-cells separately.

Definition at line 1594 of file mpp_domains_define.inc.

◆ compute_overlaps_fold_east()

subroutine compute_overlaps_fold_east ( type(domain2d), intent(inout)  domain,
integer, intent(in)  position,
integer, intent(in)  ishift,
integer, intent(in)  jshift 
)

Computes remote domain overlaps. Assumes only one overlap in each direction and calculates the overlap for T-, E-, C-, and N-cells separately. Here a folded-east and y-cyclic boundary condition is assumed.

Definition at line 4280 of file mpp_domains_define.inc.

◆ compute_overlaps_fold_south()

subroutine compute_overlaps_fold_south ( type(domain2d), intent(inout)  domain,
integer, intent(in)  position,
integer, intent(in)  ishift,
integer, intent(in)  jshift 
)

Computes remote domain overlaps. Assumes only one overlap in each direction and calculates the overlap for T-, E-, C-, and N-cells separately.

Definition at line 3011 of file mpp_domains_define.inc.

◆ compute_overlaps_fold_west()

subroutine compute_overlaps_fold_west ( type(domain2d), intent(inout)  domain,
integer, intent(in)  position,
integer, intent(in)  ishift,
integer, intent(in)  jshift 
)

Computes remote domain overlaps. Assumes only one overlap in each direction and calculates the overlap for T-, E-, C-, and N-cells separately.

Definition at line 3656 of file mpp_domains_define.inc.

◆ convert_index_back()

subroutine convert_index_back ( type(domain2d), intent(in)  domain,
integer, intent(in)  ishift,
integer, intent(in)  jshift,
integer, intent(in)  rotate,
integer, intent(in)  is_in,
integer, intent(in)  ie_in,
integer, intent(in)  js_in,
integer, intent(in)  je_in,
integer, intent(out)  is_out,
integer, intent(out)  ie_out,
integer, intent(out)  js_out,
integer, intent(out)  je_out 
)

Definition at line 2221 of file mpp_define_nest_domains.inc.

◆ convert_index_to_coarse()

integer function convert_index_to_coarse ( type(domain2d), intent(in)  domain,
integer, intent(in)  ishift,
integer, intent(in)  jshift,
integer, intent(in)  tile_coarse,
integer, intent(in)  istart_coarse,
integer, intent(in)  iend_coarse,
integer, intent(in)  jstart_coarse,
integer, intent(in)  jend_coarse,
integer, intent(in)  ntiles_coarse,
integer, intent(in)  tile_in,
integer, intent(in)  is_in,
integer, intent(in)  ie_in,
integer, intent(in)  js_in,
integer, intent(in)  je_in,
integer, dimension(:), intent(out)  is_out,
integer, dimension(:), intent(out)  ie_out,
integer, dimension(:), intent(out)  js_out,
integer, dimension(:), intent(out)  je_out,
integer, dimension(:), intent(out)  rotate_out 
)

Definition at line 2079 of file mpp_define_nest_domains.inc.

◆ convert_index_to_nest()

integer function convert_index_to_nest ( type(domain2d), intent(in)  domain,
integer, intent(in)  ishift,
integer, intent(in)  jshift,
integer, intent(in)  tile_coarse,
integer, intent(in)  istart_coarse,
integer, intent(in)  iend_coarse,
integer, intent(in)  jstart_coarse,
integer, intent(in)  jend_coarse,
integer, intent(in)  ntiles_coarse,
integer, intent(in)  tile_in,
integer, intent(in)  is_in,
integer, intent(in)  ie_in,
integer, intent(in)  js_in,
integer, intent(in)  je_in,
integer, dimension(:), intent(out)  is_out,
integer, dimension(:), intent(out)  ie_out,
integer, dimension(:), intent(out)  js_out,
integer, dimension(:), intent(out)  je_out,
integer, dimension(:), intent(out)  rotate_out 
)

This routine converts global coarse-grid indices to nest-grid indices.

Definition at line 1938 of file mpp_define_nest_domains.inc.

◆ copy_nest_overlap()

subroutine copy_nest_overlap ( type(overlap_type), intent(inout)  overlap_out,
type(overlap_type), intent(in)  overlap_in 
)

Definition at line 1521 of file mpp_define_nest_domains.inc.

◆ deallocate_comm()

subroutine deallocate_comm ( type(domaincommunicator2d), intent(inout)  d_comm)

Definition at line 662 of file mpp_domains_comm.inc.

◆ deallocate_domain2d_local()

subroutine deallocate_domain2d_local ( type(domain2d), intent(inout)  domain)

Definition at line 7690 of file mpp_domains_define.inc.

◆ deallocate_nest_overlap()

subroutine deallocate_nest_overlap ( type(overlap_type), intent(inout)  overlap)

Definition at line 1463 of file mpp_define_nest_domains.inc.

◆ deallocate_overlap_type()

subroutine deallocate_overlap_type ( type(overlap_type), intent(inout)  overlap)

Definition at line 7990 of file mpp_domains_define.inc.

◆ deallocate_overlapspec()

subroutine deallocate_overlapspec ( type(overlapspec), intent(inout)  overlap)

Definition at line 8015 of file mpp_domains_define.inc.

◆ debug_message_size()

subroutine debug_message_size ( type(nestspec), intent(in)  overlap,
character(len=*), intent(in)  name 
)

Definition at line 1374 of file mpp_define_nest_domains.inc.

◆ define_contact_point()

subroutine define_contact_point ( type(domain2d), intent(inout)  domain,
integer, intent(in)  position,
integer, intent(in)  num_contact,
integer, dimension(:), intent(in)  tile1,
integer, dimension(:), intent(in)  tile2,
integer, dimension(:), intent(in)  align1,
integer, dimension(:), intent(in)  align2,
real, dimension(:), intent(in)  refine1,
real, dimension(:), intent(in)  refine2,
integer, dimension(:), intent(in)  istart1,
integer, dimension(:), intent(in)  iend1,
integer, dimension(:), intent(in)  jstart1,
integer, dimension(:), intent(in)  jend1,
integer, dimension(:), intent(in)  istart2,
integer, dimension(:), intent(in)  iend2,
integer, dimension(:), intent(in)  jstart2,
integer, dimension(:), intent(in)  jend2,
integer, dimension(:), intent(in)  isglist,
integer, dimension(:), intent(in)  ieglist,
integer, dimension(:), intent(in)  jsglist,
integer, dimension(:), intent(in)  jeglist 
)

Computes the overlap between tiles for the T-cell.

Parameters
[in]  num_contact   number of contact regions
[in]  tile2         tile number
[in]  align2        align direction of contact region
[in]  refine2       refinement between tiles
[in]  iend1         i-index in tile_1 of contact region
[in]  jend1         j-index in tile_1 of contact region
[in]  iend2         i-index in tile_2 of contact region
[in]  jend2         j-index in tile_2 of contact region
[in]  ieglist       i-global domain of each tile
[in]  jeglist       j-global domain of each tile

Definition at line 5291 of file mpp_domains_define.inc.

◆ define_nest_level_type()

subroutine define_nest_level_type ( type(nest_level_type), intent(inout)  nest_domain,
integer, intent(in)  x_refine,
integer, intent(in)  y_refine,
integer, intent(in)  extra_halo 
)
Parameters
[in,out]  nest_domain   nest domain to be defined
[in]      extra_halo    halo value
[in]      y_refine      x and y refinements

Definition at line 465 of file mpp_define_nest_domains.inc.

◆ domain_update_is_needed()

logical function domain_update_is_needed ( type(domain2d), intent(in)  domain,
integer, intent(in)  whalo,
integer, intent(in)  ehalo,
integer, intent(in)  shalo,
integer, intent(in)  nhalo 
)

Set user stack size.

This sets the size of an array that is used for internal storage by mpp_domains. This array is used, for instance, to buffer the data sent and received in halo updates.
This call has implied global synchronization. It should be placed somewhere where all PEs can call it.

Definition at line 1002 of file mpp_domains_util.inc.

◆ expand_check_overlap_list()

subroutine expand_check_overlap_list ( type(overlap_type), dimension(:), pointer  overlaplist,
integer, intent(in)  npes 
)

Definition at line 8128 of file mpp_domains_define.inc.

◆ expand_update_overlap_list()

subroutine expand_update_overlap_list ( type(overlap_type), dimension(:), pointer  overlaplist,
integer, intent(in)  npes 
)

Definition at line 8103 of file mpp_domains_define.inc.

◆ fill_contact()

subroutine fill_contact ( type(contact_type), intent(inout)  contact,
integer, intent(in)  tile,
integer, intent(in)  is1,
integer, intent(in)  ie1,
integer, intent(in)  js1,
integer, intent(in)  je1,
integer, intent(in)  is2,
integer, intent(in)  ie2,
integer, intent(in)  js2,
integer, intent(in)  je2,
integer, intent(in)  align1,
integer, intent(in)  align2,
real, intent(in)  refine1,
real, intent(in)  refine2 
)

always fill the contact according to index order.

Definition at line 5923 of file mpp_domains_define.inc.

◆ fill_corner_contact()

subroutine fill_corner_contact ( type(contact_type), dimension(:), intent(in)  econt,
type(contact_type), dimension(:), intent(in)  scont,
type(contact_type), dimension(:), intent(in)  wcont,
type(contact_type), dimension(:), intent(in)  ncont,
integer, dimension(:), intent(in)  isg,
integer, dimension(:), intent(in)  ieg,
integer, dimension(:), intent(in)  jsg,
integer, dimension(:), intent(in)  jeg,
integer, intent(inout)  numr,
integer, intent(inout)  nums,
integer, dimension(:), intent(inout)  tilerecv,
integer, dimension(:), intent(inout)  tilesend,
integer, dimension(:), intent(inout)  is1recv,
integer, dimension(:), intent(inout)  ie1recv,
integer, dimension(:), intent(inout)  js1recv,
integer, dimension(:), intent(inout)  je1recv,
integer, dimension(:), intent(inout)  is2recv,
integer, dimension(:), intent(inout)  ie2recv,
integer, dimension(:), intent(inout)  js2recv,
integer, dimension(:), intent(inout)  je2recv,
integer, dimension(:), intent(inout)  is1send,
integer, dimension(:), intent(inout)  ie1send,
integer, dimension(:), intent(inout)  js1send,
integer, dimension(:), intent(inout)  je1send,
integer, dimension(:), intent(inout)  is2send,
integer, dimension(:), intent(inout)  ie2send,
integer, dimension(:), intent(inout)  js2send,
integer, dimension(:), intent(inout)  je2send,
integer, dimension(:), intent(inout)  align1recv,
integer, dimension(:), intent(inout)  align2recv,
integer, dimension(:), intent(inout)  align1send,
integer, dimension(:), intent(inout)  align2send,
integer, intent(in)  whalo,
integer, intent(in)  ehalo,
integer, intent(in)  shalo,
integer, intent(in)  nhalo,
integer, intent(in)  tileme 
)

Definition at line 7007 of file mpp_domains_define.inc.

◆ fill_overlap()

subroutine fill_overlap ( type(overlap_type), intent(inout)  overlap,
type(domain2d), intent(inout)  domain,
integer, intent(in)  m,
integer, intent(in)  is,
integer, intent(in)  ie,
integer, intent(in)  js,
integer, intent(in)  je,
integer, intent(in)  isc,
integer, intent(in)  iec,
integer, intent(in)  jsc,
integer, intent(in)  jec,
integer, intent(in)  isg,
integer, intent(in)  ieg,
integer, intent(in)  jsg,
integer, intent(in)  jeg,
integer, intent(in)  dir,
logical, intent(in), optional  reverse,
logical, intent(in), optional  symmetry 
)

Definition at line 2977 of file mpp_domains_define.inc.

◆ fill_overlap_recv_fold()

subroutine fill_overlap_recv_fold ( type(overlap_type), intent(inout)  overlap,
type(domain2d), intent(inout)  domain,
integer, intent(in)  m,
integer, intent(in)  is,
integer, intent(in)  ie,
integer, intent(in)  js,
integer, intent(in)  je,
integer, intent(in)  isd,
integer, intent(in)  ied,
integer, intent(in)  jsd,
integer, intent(in)  jed,
integer, intent(in)  isg,
integer, intent(in)  ieg,
integer, intent(in)  dir,
integer, intent(in)  ishift,
integer, intent(in)  position,
integer, intent(in)  ioff,
integer, intent(in)  middle,
logical, intent(in), optional  symmetry 
)

Definition at line 2905 of file mpp_domains_define.inc.

◆ fill_overlap_recv_nofold()

subroutine fill_overlap_recv_nofold ( type(overlap_type), intent(inout)  overlap,
type(domain2d), intent(inout)  domain,
integer, intent(in)  m,
integer, intent(in)  is,
integer, intent(in)  ie,
integer, intent(in)  js,
integer, intent(in)  je,
integer, intent(in)  isd,
integer, intent(in)  ied,
integer, intent(in)  jsd,
integer, intent(in)  jed,
integer, intent(in)  isg,
integer, intent(in)  ieg,
integer, intent(in)  dir,
integer, intent(in)  ioff,
logical, intent(in)  is_cyclic,
logical, intent(in), optional  folded,
logical, intent(in), optional  symmetry 
)

Definition at line 2869 of file mpp_domains_define.inc.

◆ fill_overlap_send_fold()

subroutine fill_overlap_send_fold ( type(overlap_type), intent(inout)  overlap,
type(domain2d), intent(inout)  domain,
integer, intent(in)  m,
integer, intent(in)  is,
integer, intent(in)  ie,
integer, intent(in)  js,
integer, intent(in)  je,
integer, intent(in)  isc,
integer, intent(in)  iec,
integer, intent(in)  jsc,
integer, intent(in)  jec,
integer, intent(in)  isg,
integer, intent(in)  ieg,
integer, intent(in)  dir,
integer, intent(in)  ishift,
integer, intent(in)  position,
integer, intent(in)  ioff,
integer, intent(in)  middle,
logical, intent(in), optional  symmetry 
)

Definition at line 2808 of file mpp_domains_define.inc.

◆ fill_overlap_send_nofold()

subroutine fill_overlap_send_nofold ( type(overlap_type), intent(inout)  overlap,
type(domain2d), intent(inout)  domain,
integer, intent(in)  m,
integer, intent(in)  is,
integer, intent(in)  ie,
integer, intent(in)  js,
integer, intent(in)  je,
integer, intent(in)  isc,
integer, intent(in)  iec,
integer, intent(in)  jsc,
integer, intent(in)  jec,
integer, intent(in)  isg,
integer, intent(in)  ieg,
integer, intent(in)  dir,
integer, intent(in)  ioff,
logical, intent(in)  is_cyclic,
logical, intent(in), optional  folded,
logical, intent(in), optional  symmetry 
)

Definition at line 2784 of file mpp_domains_define.inc.

◆ find_index()

integer function find_index ( integer, dimension(:), intent(in)  array,
integer, intent(in)  index_data,
integer, intent(in)  start_pos 
)

Definition at line 1353 of file mpp_define_nest_domains.inc.

◆ find_key()

integer function find_key ( integer(i8_kind), intent(in)  key,
integer(i8_kind), dimension(:), intent(in)  sorted,
integer, intent(out)  insert 
)

Definition at line 612 of file mpp_domains_comm.inc.

◆ free_comm()

subroutine free_comm ( integer(i8_kind), intent(in)  domain_id,
integer(i8_kind), intent(in)  l_addr,
integer(i8_kind), intent(in), optional  l_addr2 
)

Definition at line 473 of file mpp_domains_comm.inc.

◆ get_coarse_index()

subroutine get_coarse_index ( integer, intent(in)  rotate,
integer, intent(in)  is,
integer, intent(in)  ie,
integer, intent(in)  js,
integer, intent(in)  je,
integer, intent(in)  iadd,
integer, intent(in)  jadd,
integer, intent(out)  is_c,
integer, intent(out)  ie_c,
integer, intent(out)  js_c,
integer, intent(out)  je_c 
)

Definition at line 1809 of file mpp_define_nest_domains.inc.

◆ get_comm()

type(domaincommunicator2d) function, pointer get_comm ( integer(i8_kind), intent(in)  domain_id,
integer(i8_kind), intent(in)  l_addr,
integer(i8_kind), intent(in), optional  l_addr2 
)

Definition at line 505 of file mpp_domains_comm.inc.

◆ get_fold_index_east()

subroutine get_fold_index_east ( integer, intent(in)  jsg,
integer, intent(in)  jeg,
integer, intent(in)  ieg,
integer, intent(in)  jshift,
integer, intent(in)  position,
integer, intent(inout)  is,
integer, intent(inout)  ie,
integer, intent(inout)  js,
integer, intent(inout)  je 
)

Definition at line 4903 of file mpp_domains_define.inc.

◆ get_fold_index_north()

subroutine get_fold_index_north ( integer, intent(in)  isg,
integer, intent(in)  ieg,
integer, intent(in)  jeg,
integer, intent(in)  ishift,
integer, intent(in)  position,
integer, intent(inout)  is,
integer, intent(inout)  ie,
integer, intent(inout)  js,
integer, intent(inout)  je 
)

Definition at line 4948 of file mpp_domains_define.inc.

◆ get_fold_index_south()

subroutine get_fold_index_south ( integer, intent(in)  isg,
integer, intent(in)  ieg,
integer, intent(in)  jsg,
integer, intent(in)  ishift,
integer, intent(in)  position,
integer, intent(inout)  is,
integer, intent(inout)  ie,
integer, intent(inout)  js,
integer, intent(inout)  je 
)

Definition at line 4926 of file mpp_domains_define.inc.

◆ get_fold_index_west()

subroutine get_fold_index_west ( integer, intent(in)  jsg,
integer, intent(in)  jeg,
integer, intent(in)  isg,
integer, intent(in)  jshift,
integer, intent(in)  position,
integer, intent(inout)  is,
integer, intent(inout)  ie,
integer, intent(inout)  js,
integer, intent(inout)  je 
)

Definition at line 4880 of file mpp_domains_define.inc.

◆ get_mesgsize()

integer function get_mesgsize ( type(overlap_type), intent(in)  overlap,
logical, dimension(:), intent(in)  do_dir 
)

Definition at line 1715 of file mpp_domains_util.inc.

◆ get_nest_vector_recv()

integer function get_nest_vector_recv ( type(nest_level_type), intent(in)  nest_domain,
type(nestspec), intent(in)  update_x,
type(nestspec), intent(in)  update_y,
integer, dimension(:), intent(out)  ind_x,
integer, dimension(:), intent(out)  ind_y,
integer, dimension(:), intent(out)  start_pos,
integer, dimension(:), intent(out)  pelist 
)

Definition at line 2277 of file mpp_define_nest_domains.inc.

◆ get_nest_vector_send()

integer function get_nest_vector_send ( type(nest_level_type), intent(in)  nest_domain,
type(nestspec), intent(in)  update_x,
type(nestspec), intent(in)  update_y,
integer, dimension(:), intent(out)  ind_x,
integer, dimension(:), intent(out)  ind_y,
integer, dimension(:), intent(out)  start_pos,
integer, dimension(:), intent(out)  pelist 
)

Definition at line 2361 of file mpp_define_nest_domains.inc.

◆ get_nnest()

subroutine get_nnest ( type(domain2d), intent(in)  domain,
integer, intent(in)  num_nest,
integer, dimension(:), intent(in)  tile_coarse,
integer, dimension(:), intent(in)  istart_coarse,
integer, dimension(:), intent(in)  iend_coarse,
integer, dimension(:), intent(in)  jstart_coarse,
integer, dimension(:), intent(in)  jend_coarse,
integer, intent(in)  x_refine,
integer, intent(in)  y_refine,
integer, intent(out)  nnest,
integer, dimension(:), intent(out)  t_coarse,
integer, dimension(:), intent(out)  ncross_coarse,
integer, dimension(:), intent(out)  rotate_coarse,
integer, dimension(:), intent(out)  is_coarse,
integer, dimension(:), intent(out)  ie_coarse,
integer, dimension(:), intent(out)  js_coarse,
integer, dimension(:), intent(out)  je_coarse,
integer, dimension(:), intent(out)  is_fine,
integer, dimension(:), intent(out)  ie_fine,
integer, dimension(:), intent(out)  js_fine,
integer, dimension(:), intent(out)  je_fine 
)

Definition at line 1826 of file mpp_define_nest_domains.inc.

◆ get_rank_recv()

integer function get_rank_recv ( type(domain2d), intent(in)  domain,
type(overlapspec), intent(in)  overlap_x,
type(overlapspec), intent(in)  overlap_y,
integer, intent(out)  rank_x,
integer, intent(out)  rank_y,
integer, intent(out)  ind_x,
integer, intent(out)  ind_y 
)

Definition at line 1523 of file mpp_domains_util.inc.

◆ get_rank_send()

integer function get_rank_send ( type(domain2d), intent(in)  domain,
type(overlapspec), intent(in)  overlap_x,
type(overlapspec), intent(in)  overlap_y,
integer, intent(out)  rank_x,
integer, intent(out)  rank_y,
integer, intent(out)  ind_x,
integer, intent(out)  ind_y 
)

Definition at line 1496 of file mpp_domains_util.inc.

◆ get_rank_unpack()

integer function get_rank_unpack ( type(domain2d), intent(in)  domain,
type(overlapspec), intent(in)  overlap_x,
type(overlapspec), intent(in)  overlap_y,
integer, intent(out)  rank_x,
integer, intent(out)  rank_y,
integer, intent(out)  ind_x,
integer, intent(out)  ind_y 
)

Definition at line 1687 of file mpp_domains_util.inc.

◆ get_vector_recv()

integer function get_vector_recv ( type(domain2d), intent(in)  domain,
type(overlapspec), intent(in)  update_x,
type(overlapspec), intent(in)  update_y,
integer, dimension(:), intent(out)  ind_x,
integer, dimension(:), intent(out)  ind_y,
integer, dimension(:), intent(out)  start_pos,
integer, dimension(:), intent(out)  pelist 
)

Definition at line 1553 of file mpp_domains_util.inc.

◆ get_vector_send()

integer function get_vector_send ( type(domain2d), intent(in)  domain,
type(overlapspec), intent(in)  update_x,
type(overlapspec), intent(in)  update_y,
integer, dimension(:), intent(out)  ind_x,
integer, dimension(:), intent(out)  ind_y,
integer, dimension(:), intent(out)  start_pos,
integer, dimension(:), intent(out)  pelist 
)

Definition at line 1619 of file mpp_domains_util.inc.

◆ init_index_type()

subroutine init_index_type ( type(index_type), intent(inout)  indexdata)

Definition at line 1429 of file mpp_define_nest_domains.inc.

◆ init_overlap_type()

subroutine init_overlap_type ( type(overlap_type), intent(inout)  overlap)

Definition at line 7870 of file mpp_domains_define.inc.

◆ insert_check_overlap()

subroutine insert_check_overlap ( type(overlap_type), intent(inout)  overlap,
integer, intent(in)  pe,
integer, intent(in)  tileme,
integer, intent(in)  dir,
integer, intent(in)  rotation,
integer, intent(in)  is,
integer, intent(in)  ie,
integer, intent(in)  js,
integer, intent(in)  je 
)

Definition at line 7787 of file mpp_domains_define.inc.

◆ insert_nest_overlap()

subroutine insert_nest_overlap ( type(overlap_type), intent(inout)  overlap,
integer, intent(in)  pe,
integer, intent(in)  is,
integer, intent(in)  ie,
integer, intent(in)  js,
integer, intent(in)  je,
integer, intent(in)  dir,
integer, intent(in)  rotation 
)

Definition at line 1479 of file mpp_define_nest_domains.inc.

◆ insert_overlap_type()

subroutine insert_overlap_type ( type(overlap_type), intent(inout)  overlap,
integer, intent(in)  pe,
integer, intent(in)  tileme,
integer, intent(in)  tilenbr,
integer, intent(in)  is,
integer, intent(in)  ie,
integer, intent(in)  js,
integer, intent(in)  je,
integer, intent(in)  dir,
integer, intent(in)  rotation,
logical, intent(in)  from_contact 
)

Definition at line 7956 of file mpp_domains_define.inc.

◆ insert_update_overlap()

subroutine insert_update_overlap ( type(overlap_type), intent(inout)  overlap,
integer, intent(in)  pe,
integer, intent(in)  is1,
integer, intent(in)  ie1,
integer, intent(in)  js1,
integer, intent(in)  je1,
integer, intent(in)  is2,
integer, intent(in)  ie2,
integer, intent(in)  js2,
integer, intent(in)  je2,
integer, intent(in)  dir,
logical, intent(in), optional  reverse,
logical, intent(in), optional  symmetry 
)

Definition at line 7901 of file mpp_domains_define.inc.

◆ mpp_clear_group_update()

subroutine mpp_clear_group_update ( type(mpp_group_update_type), intent(inout)  group)

Definition at line 2374 of file mpp_domains_util.inc.

◆ mpp_compute_block_extent()

subroutine mpp_compute_block_extent ( integer, intent(in)  isg,
integer, intent(in)  ieg,
integer, intent(in)  ndivs,
integer, dimension(:), intent(out)  ibegin,
integer, dimension(:), intent(out)  iend 
)

Computes the extents of a grid block.

This implementation is different from mpp_compute_extent: the last block may hold the most points.
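
A minimal usage sketch (the global range 1:100 and the 4 divisions are illustrative):

integer :: ibegin(4), iend(4)
call mpp_compute_block_extent( 1, 100, 4, ibegin, iend )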

Definition at line 161 of file mpp_domains_define.inc.

◆ mpp_compute_extent()

subroutine mpp_compute_extent ( integer, intent(in)  isg,
integer, intent(in)  ieg,
integer, intent(in)  ndivs,
integer, dimension(0:), intent(out)  ibegin,
integer, dimension(0:), intent(out)  iend,
integer, dimension(0:), intent(in), optional  extent 
)

Computes extents for a grid decomposition with the given indices and divisions.
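
A minimal usage sketch (the global range 1:100 and the 4 divisions are illustrative; note that the output arrays start at index 0):

integer :: ibegin(0:3), iend(0:3)
call mpp_compute_extent( 1, 100, 4, ibegin, iend )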

Definition at line 187 of file mpp_domains_define.inc.

◆ mpp_copy_domain1d()

recursive subroutine mpp_copy_domain1d ( type(domain1d), intent(in)  domain_in,
type(domain1d), intent(inout)  domain_out 
)

Copies input 1d domain to the output 1d domain.

Parameters
[in]domain_inInput domain
[in,out]domain_outOutput domain

Definition at line 1741 of file mpp_domains_util.inc.

◆ mpp_copy_domain1d_spec()

subroutine mpp_copy_domain1d_spec ( type(domain1d_spec), intent(in)  domain1d_spec_in,
type(domain1d_spec), intent(out)  domain1d_spec_out 
)

Copies input 1d domain spec to the output 1d domain spec.

Parameters
[in]domain1d_spec_inInput
[out]domain1d_spec_outOutput

Definition at line 1896 of file mpp_domains_util.inc.

◆ mpp_copy_domain2d()

subroutine mpp_copy_domain2d ( type(domain2d), intent(in)  domain_in,
type(domain2d), intent(inout)  domain_out 
)

Copies input 2d domain to the output 2d domain.

Parameters
[in]domain_inInput domain
[in,out]domain_outOutput domain

Definition at line 1773 of file mpp_domains_util.inc.

◆ mpp_copy_domain2d_spec()

subroutine mpp_copy_domain2d_spec ( type(domain2d_spec), intent(in)  domain2d_spec_in,
type(domain2d_spec), intent(out)  domain2d_spec_out 
)

Copies input 2d domain spec to the output 2d domain spec.

Parameters
[in]domain2d_spec_inInput
[out]domain2d_spec_outOutput

Definition at line 1850 of file mpp_domains_util.inc.

◆ mpp_copy_domain_axis_spec()

subroutine mpp_copy_domain_axis_spec ( type(domain_axis_spec), intent(in)  domain_axis_spec_in,
type(domain_axis_spec), intent(out)  domain_axis_spec_out 
)

Copies input domain_axis_spec to the output domain_axis_spec.

Parameters
[in]domain_axis_spec_inInput
[out]domain_axis_spec_outOutput

Definition at line 1907 of file mpp_domains_util.inc.

◆ mpp_create_super_grid_domain()

subroutine mpp_create_super_grid_domain ( type(domain2d), intent(inout)  domain)

Modifies the indices of the input domain to create the supergrid domain.

This is an example of how to use mpp_create_super_grid_domain

call mpp_copy_domain(domain_in, domain_out)
call super_grid_domain(domain_out)

domain_in is the original domain, domain_out is the domain with the supergrid indices.

Parameters
[in,out]domainInput domain

Definition at line 293 of file mpp_domains_util.inc.

◆ mpp_deallocate_domain1d()

subroutine mpp_deallocate_domain1d ( type(domain1d), intent(inout)  domain)

Definition at line 7668 of file mpp_domains_define.inc.

◆ mpp_deallocate_domain2d()

subroutine mpp_deallocate_domain2d ( type(domain2d), intent(inout)  domain)

Definition at line 7677 of file mpp_domains_define.inc.

◆ mpp_define_domains1d()

subroutine mpp_define_domains1d ( integer, dimension(:), intent(in)  global_indices,
integer, intent(in)  ndivs,
type(domain1d), intent(inout)  domain,
integer, dimension(0:), intent(in), optional  pelist,
integer, intent(in), optional  flags,
integer, intent(in), optional  halo,
integer, dimension(0:), intent(in), optional  extent,
logical, dimension(0:), intent(in), optional  maskmap,
integer, intent(in), optional  memory_size,
integer, intent(in), optional  begin_halo,
integer, intent(in), optional  end_halo 
)

Define data and computational domains on a 1D set of data (isg:ieg) and assign them to PEs.

Parameters
[in]global_indices(/ isg, ieg /) gives the extent of global domain
[in]ndivsnumber of divisions of domain: even divisions unless extent is present.
[in,out]domainthe returned domain1D; declared inout so that existing links, if any, can be nullified
[in]pelistlist of PEs to which domains are to be assigned (default 0...npes-1); size of pelist must correspond to number of mask=.TRUE. divisions
[in]flagsdefine whether compute and data domains are global (undecomposed) and whether the global domain has periodic boundaries
[in]halodefines the halo width (currently the same on both sides)
[in]extentarray extent; defines width of each division (used for non-uniform domain decomp, for e.g load-balancing)
[in]maskmapa division whose maskmap=.FALSE. is not assigned to any domain. By default we assume decomposition of compute and data domains, non-periodic boundaries, no halo, as close to uniform extents as the input parameters permit
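
A minimal usage sketch, called through the generic mpp_define_domains interface (the global extent 1:100, the halo width, and mpp_npes from mpp_mod are illustrative assumptions):

type(domain1d) :: domain
call mpp_define_domains( (/1,100/), mpp_npes(), domain, halo=2 )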

Definition at line 281 of file mpp_domains_define.inc.

◆ mpp_define_domains2d()

subroutine mpp_define_domains2d ( integer, dimension(:), intent(in)  global_indices,
integer, dimension(:), intent(in)  layout,
type(domain2d), intent(inout)  domain,
integer, dimension(0:), intent(in), optional  pelist,
integer, intent(in), optional  xflags,
integer, intent(in), optional  yflags,
integer, intent(in), optional  xhalo,
integer, intent(in), optional  yhalo,
integer, dimension(0:), intent(in), optional  xextent,
integer, dimension(0:), intent(in), optional  yextent,
logical, dimension(0:,0:), intent(in), optional  maskmap,
character(len=*), intent(in), optional  name,
logical, intent(in), optional  symmetry,
integer, dimension(:), intent(in), optional  memory_size,
integer, intent(in), optional  whalo,
integer, intent(in), optional  ehalo,
integer, intent(in), optional  shalo,
integer, intent(in), optional  nhalo,
logical, intent(in), optional  is_mosaic,
integer, intent(in), optional  tile_count,
integer, intent(in), optional  tile_id,
logical, intent(in), optional  complete,
integer, intent(in), optional  x_cyclic_offset,
integer, intent(in), optional  y_cyclic_offset 
)

Define 2D data and computational domain on global rectilinear cartesian domain (isg:ieg,jsg:jeg) and assign them to PEs.

Parameters
[in]global_indices(/ isg, ieg, jsg, jeg /)
[in]layoutpe layout
[in,out]domain2D domain decomposition to define
[in]pelistcurrent pelist to run on
[in]yflagsdirectional flag
[in]yhalohalo sizes for x and y indices
[in]is_mosaicindicate if calling mpp_define_domains from mpp_define_mosaic.
[in]nhalohalo size for the west, east, south and north directions. If whalo and ehalo are not present, they take the value of xhalo; if shalo and nhalo are not present, they take the value of yhalo
[in]tile_counttile number on the current pe, default is 1; used when multiple tiles are placed on one processor
[in]tile_idtile id
[in]completetrue indicates that mpp_define_domains is complete for the mosaic definition.
[in]x_cyclic_offsetoffset for x-cyclic boundary condition, (0,j) = (ni, mod(j+x_cyclic_offset,nj)) (ni+1, j)=(1 ,mod(j+nj-x_cyclic_offset,nj))
[in]y_cyclic_offsetoffset for y-cyclic boundary condition (i,0) = (mod(i+y_cyclic_offset,ni), nj) (i,nj+1) = (mod(i+ni-y_cyclic_offset,ni), 1)
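
A minimal usage sketch, called through the generic mpp_define_domains interface (ni, nj, the halo widths and the name are illustrative; mpp_npes comes from mpp_mod):

integer :: layout(2)
type(domain2d) :: domain
call mpp_define_layout( (/1,ni,1,nj/), mpp_npes(), layout )
call mpp_define_domains( (/1,ni,1,nj/), layout, domain, xhalo=2, yhalo=2, name='example domain' )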

Definition at line 608 of file mpp_domains_define.inc.

◆ mpp_define_io_domain()

subroutine mpp_define_io_domain ( type(domain2d), intent(inout)  domain,
integer, dimension(2), intent(in)  io_layout 
)

Define the layout for IO pe's for the given domain.

Parameters
[in,out]domainInput 2D domain
[in]io_layout2 value io pe layout to define
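
A minimal usage sketch (the 2x2 io_layout is illustrative; it is normally chosen so that it divides the domain layout evenly):

call mpp_define_io_domain( domain, (/2,2/) )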

Definition at line 457 of file mpp_domains_define.inc.

◆ mpp_define_layout2d()

subroutine mpp_define_layout2d ( integer, dimension(:), intent(in)  global_indices,
integer, intent(in)  ndivs,
integer, dimension(:), intent(out)  layout 
)
Parameters
[in]global_indices(/ isg, ieg, jsg, jeg /); Defines the global domain.
[in]ndivsnumber of divisions to divide global domain
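
A minimal usage sketch, called through the generic mpp_define_layout interface (ni and nj are illustrative; mpp_npes comes from mpp_mod):

integer :: layout(2)
call mpp_define_layout( (/1,ni,1,nj/), mpp_npes(), layout )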

Definition at line 27 of file mpp_domains_define.inc.

◆ mpp_define_mosaic()

subroutine mpp_define_mosaic ( integer, dimension(:,:), intent(in)  global_indices,
integer, dimension(:,:), intent(in)  layout,
type(domain2d), intent(inout)  domain,
integer, intent(in)  num_tile,
integer, intent(in)  num_contact,
integer, dimension(:), intent(in)  tile1,
integer, dimension(:), intent(in)  tile2,
integer, dimension(:), intent(in)  istart1,
integer, dimension(:), intent(in)  iend1,
integer, dimension(:), intent(in)  jstart1,
integer, dimension(:), intent(in)  jend1,
integer, dimension(:), intent(in)  istart2,
integer, dimension(:), intent(in)  iend2,
integer, dimension(:), intent(in)  jstart2,
integer, dimension(:), intent(in)  jend2,
integer, dimension(:), intent(in)  pe_start,
integer, dimension(:), intent(in)  pe_end,
integer, dimension(:), intent(in), optional  pelist,
integer, intent(in), optional  whalo,
integer, intent(in), optional  ehalo,
integer, intent(in), optional  shalo,
integer, intent(in), optional  nhalo,
integer, dimension(:,:), intent(in), optional  xextent,
integer, dimension(:,:), intent(in), optional  yextent,
logical, dimension(:,:,:), intent(in), optional  maskmap,
character(len=*), intent(in), optional  name,
integer, dimension(2), intent(in), optional  memory_size,
logical, intent(in), optional  symmetry,
integer, intent(in), optional  xflags,
integer, intent(in), optional  yflags,
integer, dimension(:), intent(in), optional  tile_id 
)

Defines a domain for mosaic tile grids.

Parameters
[in]num_tilenumber of tiles in the mosaic
[in]num_contactnumber of contact region between tiles.
[in]tile2tile number
[in]iend1i-index in tile_1 of contact region
[in]jend1j-index in tile_1 of contact region
[in]iend2i-index in tile_2 of contact region
[in]jend2j-index in tile_2 of contact region
[in]pe_startstart pe of the pelist used in each tile
[in]pe_endend pe of the pelist used in each tile
[in]pelistlist of processors used in mosaic
[in]tile_idtile_id of each tile in the mosaic

Definition at line 1201 of file mpp_domains_define.inc.

◆ mpp_define_mosaic_pelist()

subroutine mpp_define_mosaic_pelist ( integer, dimension(:), intent(in)  sizes,
integer, dimension(:), intent(inout)  pe_start,
integer, dimension(:), intent(inout)  pe_end,
integer, dimension(:), intent(in), optional  pelist,
integer, dimension(:), intent(in), optional  costpertile 
)

Defines a pelist for use with mosaic tiles.

Note
The following routine may need to be revised to improve its capability; it is very hard to achieve a good balance in all situations.

Definition at line 62 of file mpp_domains_define.inc.

◆ mpp_define_nest_domains()

subroutine mpp_define_nest_domains ( type(nest_domain_type), intent(inout)  nest_domain,
type(domain2d), intent(in), target  domain,
integer, intent(in)  num_nest,
integer, dimension(:), intent(in)  nest_level,
integer, dimension(:), intent(in)  tile_fine,
integer, dimension(:), intent(in)  tile_coarse,
integer, dimension(:), intent(in)  istart_coarse,
integer, dimension(:), intent(in)  icount_coarse,
integer, dimension(:), intent(in)  jstart_coarse,
integer, dimension(:), intent(in)  jcount_coarse,
integer, dimension(:), intent(in)  npes_nest_tile,
integer, dimension(:), intent(in)  x_refine,
integer, dimension(:), intent(in)  y_refine,
integer, intent(in), optional  extra_halo,
character(len=*), intent(in), optional  name 
)

Set up a domain to pass data between aligned coarse and fine grid of nested model.

Set up a domain to pass data between the aligned coarse and fine grids of a nested model. Supports multiple and telescoping nests; a telescoping nest is a nest within a nest. Nest domains may span multiple tiles, but cannot contain a coarse-grid cube corner. Concurrent nesting is the only supported mechanism, i.e. the coarse and fine grids are on separate, non-overlapping processor lists. Coarse and fine grid domains need to be defined before calling mpp_define_nest_domains. An mpp_broadcast is needed to broadcast both fine and coarse grid domains onto all processors.

mpp_update_nest_coarse is used to pass data from the fine grid to the coarse grid computing domain. mpp_update_nest_fine is used to pass data from the coarse grid to the fine grid halo. You may call mpp_get_C2F_index before calling mpp_update_nest_fine to get the indices for passing data from coarse to fine. You may call mpp_get_F2C_index before calling mpp_update_nest_coarse to get the indices for passing data from fine to coarse.

Note
The following tests for nesting of regular lat-lon grids upon a cubed-sphere grid are done in test_mpp_domains:
a) a first-level nest spanning multiple cubed-sphere faces (tiles 1, 2, & 4)
b) a first-level nest wholly contained within tile 3
c) a second-level nest contained within the nest mentioned in a)
Tests are done for data at T, E, C, N-cell center.

Below is an example of passing data between fine and coarse grids (more details on how to use the nesting domain update are available in routine test_update_nest_domain of test_fms/mpp/test_mpp_domains.F90):

if( concurrent ) then
call mpp_broadcast_domain(domain_fine)
call mpp_broadcast_domain(domain_coarse)
endif
call mpp_define_nest_domains(nest_domain,domain,num_nest,nest_level(1:num_nest), &
tile_fine(1:num_nest), tile_coarse(1:num_nest), &
istart_coarse(1:num_nest), icount_coarse(1:num_nest), &
jstart_coarse(1:num_nest), jcount_coarse(1:num_nest), &
npes_nest_tile, x_refine(1:num_nest), y_refine(1:num_nest), &
extra_halo=extra_halo, name="nest_domain")
call mpp_get_c2f_index(nest_domain, isw_f, iew_f, jsw_f, jew_f, isw_c, iew_c, jsw_c, jew_c, west, level)
call mpp_get_c2f_index(nest_domain, ise_f, iee_f, jse_f, jee_f, ise_c, iee_c, jse_c, jee_c, east, level)
call mpp_get_c2f_index(nest_domain, iss_f, ies_f, jss_f, jes_f, iss_c, ies_c, jss_c, jes_c, south, level)
call mpp_get_c2f_index(nest_domain, isn_f, ien_f, jsn_f, jen_f, isn_c, ien_c, jsn_c, jen_c, north, level)
allocate(wbuffer(isw_c:iew_c, jsw_c:jew_c,nz))
allocate(ebuffer(ise_c:iee_c, jse_c:jee_c,nz))
allocate(sbuffer(iss_c:ies_c, jss_c:jes_c,nz))
allocate(nbuffer(isn_c:ien_c, jsn_c:jen_c,nz))
call mpp_update_nest_fine(x, nest_domain, wbuffer, sbuffer, ebuffer, nbuffer)
call mpp_get_f2c_index(nest_domain, is_c, ie_c, js_c, je_c, is_f, ie_f, js_f, je_f, nest_level=level)
allocate(buffer(is_f:ie_f, js_f:je_f,nz))
call mpp_update_nest_coarse(x, nest_domain, buffer)

Note
Currently the contact will be limited to overlap contact.

Parameters
[in,out]nest_domainholds the information to pass data between nest and parent grids
[in]domaindomain for the grid defined in the current pelist
[in]num_nestnumber of nests
[in]nest_levelarray containing the nest level for each nest (>1 implies a telescoping nest)
[in]tile_fine,tile_coarsearray containing the tile number of each nest grid (monotonically increasing, starting with 7); array containing the tile number of the parent grid corresponding to the lower left corner of a given nest
[in]istart_coarse,icount_coarse,jstart_coarse,jcount_coarsestart: arrays containing the index in the parent grid of the lower left corner of a given nest; count: arrays containing the span of the nest on the parent grid
[in]npes_nest_tilearray containing the number of pes allocated to each defined tile
[in]x_refine,y_refinearrays containing the refinement ratio for each nest
[in]extra_haloextra halo for passing data from coarse grid to fine grid; default is 0 and currently only extra_halo = 0 is supported
[in]namename of the nest domain

Definition at line 95 of file mpp_define_nest_domains.inc.

◆ mpp_define_null_domain1d()

subroutine mpp_define_null_domain1d ( type(domain1d), intent(inout)  domain)

Definition at line 7639 of file mpp_domains_define.inc.

◆ mpp_define_null_domain2d()

subroutine mpp_define_null_domain2d ( type(domain2d), intent(inout)  domain)

Definition at line 7652 of file mpp_domains_define.inc.

◆ mpp_domain1d_eq()

logical function mpp_domain1d_eq ( type(domain1d), intent(in)  a,
type(domain1d), intent(in)  b 
)

Definition at line 60 of file mpp_domains_util.inc.

◆ mpp_domain1d_ne()

logical function mpp_domain1d_ne ( type(domain1d), intent(in)  a,
type(domain1d), intent(in)  b 
)

Definition at line 78 of file mpp_domains_util.inc.

◆ mpp_domain2d_eq()

logical function mpp_domain2d_eq ( type(domain2d), intent(in)  a,
type(domain2d), intent(in)  b 
)

Definition at line 86 of file mpp_domains_util.inc.

◆ mpp_domain2d_ne()

logical function mpp_domain2d_ne ( type(domain2d), intent(in)  a,
type(domain2d), intent(in)  b 
)

Definition at line 110 of file mpp_domains_util.inc.

◆ mpp_domain_is_initialized()

logical function mpp_domain_is_initialized ( type(domain2d), intent(in)  domain)

Definition at line 988 of file mpp_domains_util.inc.

◆ mpp_domain_is_symmetry()

logical function mpp_domain_is_symmetry ( type(domain2d), intent(in)  domain)

Definition at line 978 of file mpp_domains_util.inc.

◆ mpp_domain_is_tile_root_pe()

logical function mpp_domain_is_tile_root_pe ( type(domain2d), intent(in)  domain)

Returns whether the current pe is the root pe of its tile. Returns true if the number of tiles on the current pe is greater than 1, or if isc==isg and jsc==jsg; otherwise returns false.

Definition at line 1182 of file mpp_domains_util.inc.

◆ mpp_domains_set_stack_size()

subroutine mpp_domains_set_stack_size ( integer, intent(in)  n)

Set user stack size.

This sets the size of an array that is used for internal storage by mpp_domains. This array is used, for instance, to buffer the data sent and received in halo updates.
This call has implied global synchronization. It should be placed somewhere where all PEs can call it.
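
Example usage (the value is illustrative; it should be large enough to buffer the largest halo update in the run):

call mpp_domains_set_stack_size( 1000000 )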

Definition at line 35 of file mpp_domains_util.inc.

◆ mpp_get_c2f_index()

subroutine mpp_get_c2f_index ( type(nest_domain_type), intent(in)  nest_domain,
integer, intent(out)  is_fine,
integer, intent(out)  ie_fine,
integer, intent(out)  js_fine,
integer, intent(out)  je_fine,
integer, intent(out)  is_coarse,
integer, intent(out)  ie_coarse,
integer, intent(out)  js_coarse,
integer, intent(out)  je_coarse,
integer, intent(in)  dir,
integer, intent(in)  nest_level,
integer, intent(in), optional  position 
)

Get the index of the data passed from coarse grid to fine grid.

Example usage:

call mpp_get_c2f_index(nest_domain, is_fine, ie_fine, js_fine, je_fine,
is_coarse, ie_coarse, js_coarse, je_coarse, dir,
nest_level, position)
Parameters
[in]nest_domainholds the information to pass data between fine and coarse grids
[out]je_fineindex in the fine grid of the nested region
[out]je_coarseindex in the coarse grid of the nested region
[in]dirdirection of the halo update. Its value should be WEST, EAST, SOUTH or NORTH.
[in]nest_levellevel of the nest (> 1 implies a telescoping nest)
[in]positionCell position. Its value should be CENTER, EAST, CORNER, or NORTH.

Definition at line 1638 of file mpp_define_nest_domains.inc.

◆ mpp_get_compute_domain1d()

subroutine mpp_get_compute_domain1d ( type(domain1d), intent(in)  domain,
integer, intent(out), optional  begin,
integer, intent(out), optional  end,
integer, intent(out), optional  size,
integer, intent(out), optional  max_size,
logical, intent(out), optional  is_global 
)

Definition at line 124 of file mpp_domains_util.inc.

◆ mpp_get_compute_domain2d()

subroutine mpp_get_compute_domain2d ( type(domain2d), intent(in)  domain,
integer, intent(out), optional  xbegin,
integer, intent(out), optional  xend,
integer, intent(out), optional  ybegin,
integer, intent(out), optional  yend,
integer, intent(out), optional  xsize,
integer, intent(out), optional  xmax_size,
integer, intent(out), optional  ysize,
integer, intent(out), optional  ymax_size,
logical, intent(out), optional  x_is_global,
logical, intent(out), optional  y_is_global,
integer, intent(in), optional  tile_count,
integer, intent(in), optional  position 
)
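
A minimal usage sketch, called through the generic mpp_get_compute_domain interface (the field array is an illustrative assumption):

call mpp_get_compute_domain( domain, isc, iec, jsc, jec )
allocate( field(isc:iec, jsc:jec) )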

Definition at line 178 of file mpp_domains_util.inc.

◆ mpp_get_compute_domains1d()

subroutine mpp_get_compute_domains1d ( type(domain1d), intent(in)  domain,
integer, dimension(:), intent(out), optional  begin,
integer, dimension(:), intent(out), optional  end,
integer, dimension(:), intent(out), optional  size 
)

Definition at line 445 of file mpp_domains_util.inc.

◆ mpp_get_compute_domains2d()

subroutine mpp_get_compute_domains2d ( type(domain2d), intent(in)  domain,
integer, dimension(:), intent(out), optional  xbegin,
integer, dimension(:), intent(out), optional  xend,
integer, dimension(:), intent(out), optional  xsize,
integer, dimension(:), intent(out), optional  ybegin,
integer, dimension(:), intent(out), optional  yend,
integer, dimension(:), intent(out), optional  ysize,
integer, intent(in), optional  position 
)
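
A minimal usage sketch, called through the generic mpp_get_compute_domains interface (one entry per pe in the domain pelist; npes is an illustrative variable):

npes = mpp_get_domain_npes(domain)
allocate( xbegin(npes), xend(npes), ybegin(npes), yend(npes) )
call mpp_get_compute_domains( domain, xbegin=xbegin, xend=xend, ybegin=ybegin, yend=yend )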

Definition at line 471 of file mpp_domains_util.inc.

◆ mpp_get_current_ntile()

integer function mpp_get_current_ntile ( type(domain2d), intent(in)  domain)

Returns number of tile on current pe.

Definition at line 1169 of file mpp_domains_util.inc.

◆ mpp_get_data_domain1d()

subroutine mpp_get_data_domain1d ( type(domain1d), intent(in)  domain,
integer, intent(out), optional  begin,
integer, intent(out), optional  end,
integer, intent(out), optional  size,
integer, intent(out), optional  max_size,
logical, intent(out), optional  is_global 
)

Definition at line 138 of file mpp_domains_util.inc.

◆ mpp_get_data_domain2d()

subroutine mpp_get_data_domain2d ( type(domain2d), intent(in)  domain,
integer, intent(out), optional  xbegin,
integer, intent(out), optional  xend,
integer, intent(out), optional  ybegin,
integer, intent(out), optional  yend,
integer, intent(out), optional  xsize,
integer, intent(out), optional  xmax_size,
integer, intent(out), optional  ysize,
integer, intent(out), optional  ymax_size,
logical, intent(out), optional  x_is_global,
logical, intent(out), optional  y_is_global,
integer, intent(in), optional  tile_count,
integer, intent(in), optional  position 
)
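
A minimal usage sketch, called through the generic mpp_get_data_domain interface (the field array is illustrative; the data domain includes the halo points):

call mpp_get_data_domain( domain, isd, ied, jsd, jed )
allocate( field(isd:ied, jsd:jed) )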

Definition at line 203 of file mpp_domains_util.inc.

◆ mpp_get_domain_commid()

integer function mpp_get_domain_commid ( integer, intent(in)  domain)

Definition at line 702 of file mpp_domains_util.inc.

◆ mpp_get_domain_components()

subroutine mpp_get_domain_components ( type(domain2d), intent(in)  domain,
type(domain1d), intent(inout), optional  x,
type(domain1d), intent(inout), optional  y,
integer, intent(in), optional  tile_count 
)

Retrieve 1D components of 2D decomposition.

It is sometime necessary to have direct recourse to the domain1D types that compose a domain2D object. This call retrieves them.

call mpp_get_domain_components( domain, x, y )

Definition at line 431 of file mpp_domains_util.inc.

◆ mpp_get_domain_extents1d()

subroutine mpp_get_domain_extents1d ( type(domain2d), intent(in)  domain,
integer, dimension(0:), intent(inout)  xextent,
integer, dimension(0:), intent(inout)  yextent 
)

Definition at line 617 of file mpp_domains_util.inc.

◆ mpp_get_domain_extents2d()

subroutine mpp_get_domain_extents2d ( type(domain2d), intent(in)  domain,
integer, dimension(:,:), intent(inout)  xextent,
integer, dimension(:,:), intent(inout)  yextent 
)

This will return xextent and yextent for each tile.

Definition at line 641 of file mpp_domains_util.inc.

◆ mpp_get_domain_name()

character(len=name_length) function mpp_get_domain_name ( type(domain2d), intent(in)  domain)

Definition at line 1441 of file mpp_domains_util.inc.

◆ mpp_get_domain_npes()

integer function mpp_get_domain_npes ( type(domain2d), intent(in)  domain)

Definition at line 1458 of file mpp_domains_util.inc.

◆ mpp_get_domain_pe()

integer function mpp_get_domain_pe ( type(domain2d), intent(in)  domain)

Definition at line 674 of file mpp_domains_util.inc.

◆ mpp_get_domain_pelist()

subroutine mpp_get_domain_pelist ( type(domain2d), intent(in)  domain,
integer, dimension(:), intent(out)  pelist 
)
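
A minimal usage sketch (the pelist array is illustrative and is sized here with mpp_get_domain_npes):

allocate( pelist(mpp_get_domain_npes(domain)) )
call mpp_get_domain_pelist( domain, pelist )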

Definition at line 1469 of file mpp_domains_util.inc.

◆ mpp_get_domain_root_pe()

integer function mpp_get_domain_root_pe ( type(domain2d), intent(in)  domain)

Definition at line 1450 of file mpp_domains_util.inc.

◆ mpp_get_domain_shift()

subroutine mpp_get_domain_shift ( type(domain2d), intent(in)  domain,
integer, intent(out)  ishift,
integer, intent(out)  jshift,
integer, intent(in), optional  position 
)

Returns the shift value in the x- and y-direction according to the domain position.

When the domain is symmetric, one extra point may be needed in the x- and/or y-direction. This routine returns the shift value based on the position.

call mpp_get_domain_shift( domain, ishift, jshift, position )
Parameters
[out]jshiftreturn value will be 0 or 1.
[in]positionposition of data. Its value can be CENTER, EAST, NORTH or CORNER.

Definition at line 811 of file mpp_domains_util.inc.

◆ mpp_get_domain_tile_commid()

integer function mpp_get_domain_tile_commid ( type(domain2d), intent(in)  domain)

Definition at line 693 of file mpp_domains_util.inc.

◆ mpp_get_domain_tile_root_pe()

integer function mpp_get_domain_tile_root_pe ( type(domain2d), intent(in)  domain)

Definition at line 684 of file mpp_domains_util.inc.

◆ mpp_get_f2c_index_coarse()

subroutine mpp_get_f2c_index_coarse ( type(nest_domain_type), intent(in)  nest_domain,
integer, intent(out)  is_coarse,
integer, intent(out)  ie_coarse,
integer, intent(out)  js_coarse,
integer, intent(out)  je_coarse,
integer, intent(in)  nest_level,
integer, intent(in), optional  position 
)
Parameters
[in]nest_domainHolds the information to pass data between fine and coarse grid.
[out]je_coarseindex in the coarse grid of the nested region
[in]nest_levellevel of the nest (> 1 implies a telescoping nest)
[in]positionCell position. Its value should be CENTER, EAST, CORNER, or NORTH.

Definition at line 1768 of file mpp_define_nest_domains.inc.

◆ mpp_get_f2c_index_fine()

subroutine mpp_get_f2c_index_fine ( type(nest_domain_type), intent(in)  nest_domain,
integer, intent(out)  is_coarse,
integer, intent(out)  ie_coarse,
integer, intent(out)  js_coarse,
integer, intent(out)  je_coarse,
integer, intent(out)  is_fine,
integer, intent(out)  ie_fine,
integer, intent(out)  js_fine,
integer, intent(out)  je_fine,
integer, intent(in)  nest_level,
integer, intent(in), optional  position 
)
Parameters
[in]nest_domainHolds the information to pass data between fine and coarse grid.
[out]je_fineindex in the fine grid of the nested region
[out]je_coarseindex in the coarse grid of the nested region
[in]nest_levellevel of the nest (> 1 implies a telescoping nest)
[in]positionCell position. Its value should be CENTER, EAST, CORNER, or NORTH.

Definition at line 1719 of file mpp_define_nest_domains.inc.

◆ mpp_get_global_domain1d()

subroutine mpp_get_global_domain1d ( type(domain1d), intent(in)  domain,
integer, intent(out), optional  begin,
integer, intent(out), optional  end,
integer, intent(out), optional  size,
integer, intent(out), optional  max_size 
)

Definition at line 152 of file mpp_domains_util.inc.

◆ mpp_get_global_domain2d()

subroutine mpp_get_global_domain2d ( type(domain2d), intent(in)  domain,
integer, intent(out), optional  xbegin,
integer, intent(out), optional  xend,
integer, intent(out), optional  ybegin,
integer, intent(out), optional  yend,
integer, intent(out), optional  xsize,
integer, intent(out), optional  xmax_size,
integer, intent(out), optional  ysize,
integer, intent(out), optional  ymax_size,
integer, intent(in), optional  tile_count,
integer, intent(in), optional  position 
)
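
A minimal usage sketch, called through the generic mpp_get_global_domain interface:

call mpp_get_global_domain( domain, isg, ieg, jsg, jeg )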

Definition at line 228 of file mpp_domains_util.inc.

◆ mpp_get_global_domains1d()

subroutine mpp_get_global_domains1d ( type(domain1d), intent(in)  domain,
integer, dimension(:), intent(out), optional  begin,
integer, dimension(:), intent(out), optional  end,
integer, dimension(:), intent(out), optional  size 
)

Definition at line 530 of file mpp_domains_util.inc.

◆ mpp_get_global_domains2d()

subroutine mpp_get_global_domains2d ( type(domain2d), intent(in)  domain,
integer, dimension(:), intent(out), optional  xbegin,
integer, dimension(:), intent(out), optional  xend,
integer, dimension(:), intent(out), optional  xsize,
integer, dimension(:), intent(out), optional  ybegin,
integer, dimension(:), intent(out), optional  yend,
integer, dimension(:), intent(out), optional  ysize,
integer, intent(in), optional  position 
)

Definition at line 557 of file mpp_domains_util.inc.

◆ mpp_get_io_domain()

type(domain2d) function, pointer mpp_get_io_domain ( type(domain2d), intent(in)  domain)

Definition at line 711 of file mpp_domains_util.inc.

◆ mpp_get_io_domain_layout()

integer function, dimension(2) mpp_get_io_domain_layout ( type(domain2d), intent(in)  domain)

Definition at line 1487 of file mpp_domains_util.inc.

◆ mpp_get_layout1d()

subroutine mpp_get_layout1d ( type(domain1d), intent(in)  domain,
integer, intent(out)  layout 
)

Definition at line 773 of file mpp_domains_util.inc.

◆ mpp_get_layout2d()

subroutine mpp_get_layout2d ( type(domain2d), intent(in)  domain,
integer, dimension(2), intent(out)  layout 
)

Definition at line 789 of file mpp_domains_util.inc.

◆ mpp_get_memory_domain1d()

subroutine mpp_get_memory_domain1d ( type(domain1d), intent(in)  domain,
integer, intent(out), optional  begin,
integer, intent(out), optional  end,
integer, intent(out), optional  size,
integer, intent(out), optional  max_size,
logical, intent(out), optional  is_global 
)

Definition at line 164 of file mpp_domains_util.inc.

◆ mpp_get_memory_domain2d()

subroutine mpp_get_memory_domain2d ( type(domain2d), intent(in)  domain,
integer, intent(out), optional  xbegin,
integer, intent(out), optional  xend,
integer, intent(out), optional  ybegin,
integer, intent(out), optional  yend,
integer, intent(out), optional  xsize,
integer, intent(out), optional  xmax_size,
integer, intent(out), optional  ysize,
integer, intent(out), optional  ymax_size,
logical, intent(out), optional  x_is_global,
logical, intent(out), optional  y_is_global,
integer, intent(in), optional  position 
)

Definition at line 252 of file mpp_domains_util.inc.

◆ mpp_get_neighbor_pe_1d()

subroutine mpp_get_neighbor_pe_1d ( type(domain1d), intent(inout)  domain,
integer, intent(in)  direction,
integer, intent(out)  pe 
)

Return the PE to the right/left of this PE-domain.

Definition at line 837 of file mpp_domains_util.inc.

◆ mpp_get_neighbor_pe_2d()

subroutine mpp_get_neighbor_pe_2d ( type(domain2d), intent(inout)  domain,
integer, intent(in)  direction,
integer, intent(out)  pe 
)

Return PE North/South/East/West of this PE-domain. direction must be NORTH, SOUTH, EAST or WEST.
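
A minimal usage sketch, called through the generic mpp_get_neighbor_pe interface (pe_north is an illustrative variable; NORTH is a direction parameter exported by mpp_domains_mod):

call mpp_get_neighbor_pe( domain, NORTH, pe_north )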

Definition at line 885 of file mpp_domains_util.inc.

◆ mpp_get_nest_coarse_domain()

type(domain2d) function, pointer mpp_get_nest_coarse_domain ( type(nest_domain_type), intent(in)  nest_domain,
integer, intent(in)  nest_level 
)

Definition at line 2469 of file mpp_define_nest_domains.inc.

◆ mpp_get_nest_fine_domain()

type(domain2d) function, pointer mpp_get_nest_fine_domain ( type(nest_domain_type), intent(in)  nest_domain,
integer, intent(in)  nest_level 
)

Definition at line 2484 of file mpp_define_nest_domains.inc.

◆ mpp_get_nest_fine_npes()

integer function mpp_get_nest_fine_npes ( type(nest_domain_type), intent(in)  nest_domain,
integer, intent(in)  nest_level 
)

Definition at line 2525 of file mpp_define_nest_domains.inc.

◆ mpp_get_nest_fine_pelist()

subroutine mpp_get_nest_fine_pelist ( type(nest_domain_type), intent(in)  nest_domain,
integer, intent(in)  nest_level,
integer, dimension(:), intent(out)  pelist 
)

Definition at line 2537 of file mpp_define_nest_domains.inc.

◆ mpp_get_nest_npes()

integer function mpp_get_nest_npes ( type(nest_domain_type), intent(in)  nest_domain,
integer, intent(in)  nest_level 
)

Definition at line 2499 of file mpp_define_nest_domains.inc.

◆ mpp_get_nest_pelist()

subroutine mpp_get_nest_pelist ( type(nest_domain_type), intent(in)  nest_domain,
integer, intent(in)  nest_level,
integer, dimension(:), intent(out)  pelist 
)

Definition at line 2511 of file mpp_define_nest_domains.inc.

◆ mpp_get_ntile_count()

integer function mpp_get_ntile_count ( type(domain2d), intent(in)  domain)

Returns number of tiles in mosaic.

Definition at line 1158 of file mpp_domains_util.inc.

◆ mpp_get_num_overlap()

integer function mpp_get_num_overlap ( type(domain2d), intent(in)  domain,
integer, intent(in)  action,
integer, intent(in)  p,
integer, intent(in), optional  position 
)

Definition at line 1279 of file mpp_domains_util.inc.

◆ mpp_get_overlap()

subroutine mpp_get_overlap ( type(domain2d), intent(in)  domain,
integer, intent(in)  action,
integer, intent(in)  p,
integer, dimension(:), intent(out)  is,
integer, dimension(:), intent(out)  ie,
integer, dimension(:), intent(out)  js,
integer, dimension(:), intent(out)  je,
integer, dimension(:), intent(out)  dir,
integer, dimension(:), intent(out)  rot,
integer, intent(in), optional  position 
)

Definition at line 1388 of file mpp_domains_util.inc.

◆ mpp_get_pelist1d()

subroutine mpp_get_pelist1d ( type(domain1d), intent(in)  domain,
integer, dimension(:), intent(out)  pelist,
integer, intent(out), optional  pos 
)

Definition at line 729 of file mpp_domains_util.inc.

◆ mpp_get_pelist2d()

subroutine mpp_get_pelist2d ( type(domain2d), intent(in)  domain,
integer, dimension(:), intent(out)  pelist,
integer, intent(out), optional  pos 
)

Definition at line 753 of file mpp_domains_util.inc.

◆ mpp_get_tile_compute_domains()

subroutine mpp_get_tile_compute_domains ( type(domain2d), intent(in)  domain,
integer, dimension(:), intent(out)  xbegin,
integer, dimension(:), intent(out)  xend,
integer, dimension(:), intent(out)  ybegin,
integer, dimension(:), intent(out)  yend,
integer, intent(in), optional  position 
)

Definition at line 1236 of file mpp_domains_util.inc.

◆ mpp_get_tile_id()

integer function, dimension(size(domain%tile_id(:))) mpp_get_tile_id ( type(domain2d), intent(in)  domain)

Returns the tile_id on current pe.
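
A minimal usage sketch (tile_ids is illustrative and is sized here with mpp_get_current_ntile):

allocate( tile_ids(mpp_get_current_ntile(domain)) )
tile_ids = mpp_get_tile_id(domain)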

Definition at line 1130 of file mpp_domains_util.inc.

◆ mpp_get_tile_list()

subroutine mpp_get_tile_list ( type(domain2d), intent(in)  domain,
integer, dimension(:), intent(inout)  tiles 
)

Returns the tile_id of each PE on the current pelist; one tile per PE is assumed.

Definition at line 1141 of file mpp_domains_util.inc.

◆ mpp_get_tile_npes()

integer function mpp_get_tile_npes ( type(domain2d), intent(in)  domain)

Returns the number of processors used on the current tile.

Definition at line 1192 of file mpp_domains_util.inc.

◆ mpp_get_tile_pelist()

subroutine mpp_get_tile_pelist ( type(domain2d), intent(in)  domain,
integer, dimension(:), intent(inout)  pelist 
)

Returns the list of processors used on the current tile.

Definition at line 1213 of file mpp_domains_util.inc.
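
Example usage (an illustrative sketch; the variable names and workflow are not taken from the library source):

    ! Illustrative sketch: query tile information for the current PE.
    subroutine show_tile_info(domain)
      use mpp_domains_mod, only : domain2d, mpp_get_ntile_count, mpp_get_tile_id, &
                                  mpp_get_tile_npes, mpp_get_tile_pelist
      type(domain2d), intent(in) :: domain          ! already defined
      integer :: ntiles, tile_npes
      integer, allocatable :: my_tiles(:), tile_pes(:)

      ntiles    = mpp_get_ntile_count(domain)       ! tiles in the mosaic
      my_tiles  = mpp_get_tile_id(domain)           ! tile id(s) owned by this PE
      tile_npes = mpp_get_tile_npes(domain)         ! PEs working on the current tile
      allocate(tile_pes(tile_npes))
      call mpp_get_tile_pelist(domain, tile_pes)
    end subroutine show_tile_info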

◆ mpp_get_update_pelist()

subroutine mpp_get_update_pelist ( type(domain2d), intent(in)  domain,
integer, intent(in)  action,
integer, dimension(:), intent(inout)  pelist,
integer, intent(in), optional  position 
)

Returns the list of PEs involved in the given update action on the domain.

Definition at line 1346 of file mpp_domains_util.inc.

◆ mpp_get_update_size()

subroutine mpp_get_update_size ( type(domain2d), intent(in)  domain,
integer, intent(out)  nsend,
integer, intent(out)  nrecv,
integer, intent(in), optional  position 
)

Returns the number of send (nsend) and receive (nrecv) operations required to update the domain.

Definition at line 1318 of file mpp_domains_util.inc.
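
Example usage (an illustrative sketch; the buffer-sizing motivation is an assumption):

    ! Illustrative sketch: query how many sends and receives a halo update of
    ! this domain performs, e.g. before sizing per-message bookkeeping arrays.
    subroutine show_update_size(domain)
      use mpp_domains_mod, only : domain2d, mpp_get_update_size
      type(domain2d), intent(in) :: domain          ! already defined
      integer :: nsend, nrecv

      call mpp_get_update_size(domain, nsend, nrecv)
    end subroutine show_update_size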

◆ mpp_global_field_free_comm()

subroutine mpp_global_field_free_comm ( type(domain2d), intent(in)  domain,
integer(i8_kind), intent(in)  l_addr,
integer, intent(in)  ksize,
integer(i8_kind), intent(in), optional  l_addr2,
integer, intent(in), optional  flags 
)

Definition at line 455 of file mpp_domains_comm.inc.

◆ mpp_global_field_init_comm()

type(domaincommunicator2d) function, pointer mpp_global_field_init_comm ( type(domain2d), intent(in), target  domain,
integer(i8_kind), intent(in)  l_addr,
integer, intent(in)  isize_g,
integer, intent(in)  jsize_g,
integer, intent(in)  isize_l,
integer, intent(in)  jsize_l,
integer, intent(in)  ksize,
integer(i8_kind), intent(in), optional  l_addr2,
integer, intent(in), optional  flags,
integer, intent(in), optional  position 
)

Initializes a DomainCommunicator2D type for use in mpp_global_field.

Definition at line 210 of file mpp_domains_comm.inc.

◆ mpp_group_update_initialized()

logical function mpp_group_update_initialized ( type(mpp_group_update_type), intent(in)  group)

Returns .true. if the group update has been initialized.

Definition at line 2388 of file mpp_domains_util.inc.

◆ mpp_group_update_is_set()

logical function mpp_group_update_is_set ( type(mpp_group_update_type), intent(in)  group)

Returns .true. if the group update has been set.

Definition at line 2397 of file mpp_domains_util.inc.

◆ mpp_is_nest_coarse()

logical function mpp_is_nest_coarse ( type(nest_domain_type), intent(in)  nest_domain,
integer, intent(in)  nest_level 
)

Definition at line 2570 of file mpp_define_nest_domains.inc.

◆ mpp_is_nest_fine()

logical function mpp_is_nest_fine ( type(nest_domain_type), intent(in)  nest_domain,
integer, intent(in)  nest_level 
)

Definition at line 2555 of file mpp_define_nest_domains.inc.

◆ mpp_modify_domain1d()

subroutine mpp_modify_domain1d ( type(domain1d), intent(in)  domain_in,
type(domain1d), intent(inout)  domain_out,
integer, intent(in), optional  cbegin,
integer, intent(in), optional  cend,
integer, intent(in), optional  gbegin,
integer, intent(in), optional  gend,
integer, intent(in), optional  hbegin,
integer, intent(in), optional  hend 
)

Modifies the extents of a domain.

Parameters
    [in]      domain_in    The source domain.
    [in,out]  domain_out   The returned domain.
    [in]      hend         Halo size.
    [in]      cend         Axis specifications associated with the compute domain of the returned 1D domain.
    [in]      gend         Axis specifications associated with the global domain of the returned 1D domain.

Definition at line 7545 of file mpp_domains_define.inc.

◆ mpp_modify_domain2d()

subroutine mpp_modify_domain2d ( type(domain2d), intent(in)  domain_in,
type(domain2d), intent(inout)  domain_out,
integer, intent(in), optional  isc,
integer, intent(in), optional  iec,
integer, intent(in), optional  jsc,
integer, intent(in), optional  jec,
integer, intent(in), optional  isg,
integer, intent(in), optional  ieg,
integer, intent(in), optional  jsg,
integer, intent(in), optional  jeg,
integer, intent(in), optional  whalo,
integer, intent(in), optional  ehalo,
integer, intent(in), optional  shalo,
integer, intent(in), optional  nhalo 
)
Modifies the extents of a 2D domain.

Parameters
    [in]      domain_in    The source domain.
    [in,out]  domain_out   The returned domain.
    [in]      jec          Zonal and meridional axis specifications associated with the compute domain of the returned 2D domain.
    [in]      jeg          Zonal and meridional axis specifications associated with the global domain of the returned 2D domain.
    [in]      nhalo        Halo size in the x- and y-directions.

Definition at line 7581 of file mpp_domains_define.inc.
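
Example usage (an illustrative sketch; calling through the generic mpp_modify_domain interface and the particular halo widths chosen are assumptions):

    ! Illustrative sketch: derive a copy of an existing 2D domain with 1-point halos.
    subroutine make_thin_halo_domain(domain_in, domain_out)
      use mpp_domains_mod, only : domain2d, mpp_modify_domain
      type(domain2d), intent(in)    :: domain_in    ! already defined
      type(domain2d), intent(inout) :: domain_out

      call mpp_modify_domain(domain_in, domain_out, whalo=1, ehalo=1, shalo=1, nhalo=1)
    end subroutine make_thin_halo_domain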

◆ mpp_mosaic_defined()

logical function mpp_mosaic_defined

Accessor function for value of mosaic_defined.

Definition at line 1585 of file mpp_domains_define.inc.

◆ mpp_redistribute_free_comm()

subroutine mpp_redistribute_free_comm ( type(domain2d), intent(in)  domain_in,
integer(i8_kind), intent(in)  l_addr,
type(domain2d), intent(in)  domain_out,
integer(i8_kind), intent(in)  l_addr2,
integer, intent(in)  ksize,
integer, intent(in)  lsize 
)

Definition at line 435 of file mpp_domains_comm.inc.

◆ mpp_redistribute_init_comm()

type(domaincommunicator2d) function, pointer mpp_redistribute_init_comm ( type(domain2d), intent(in), target  domain_in,
integer(i8_kind), dimension(:), intent(in)  l_addrs_in,
type(domain2d), intent(in), target  domain_out,
integer(i8_kind), dimension(:), intent(in)  l_addrs_out,
integer, intent(in)  isize_in,
integer, intent(in)  jsize_in,
integer, intent(in)  ksize_in,
integer, intent(in)  isize_out,
integer, intent(in)  jsize_out,
integer, intent(in)  ksize_out 
)

Definition at line 28 of file mpp_domains_comm.inc.

◆ mpp_set_compute_domain1d()

subroutine mpp_set_compute_domain1d ( type(domain1d), intent(inout)  domain,
integer, intent(in), optional  begin,
integer, intent(in), optional  end,
integer, intent(in), optional  size,
logical, intent(in), optional  is_global 
)

Sets the compute domain of a 1D domain.

Definition at line 338 of file mpp_domains_util.inc.

◆ mpp_set_compute_domain2d()

subroutine mpp_set_compute_domain2d ( type(domain2d), intent(inout)  domain,
integer, intent(in), optional  xbegin,
integer, intent(in), optional  xend,
integer, intent(in), optional  ybegin,
integer, intent(in), optional  yend,
integer, intent(in), optional  xsize,
integer, intent(in), optional  ysize,
logical, intent(in), optional  x_is_global,
logical, intent(in), optional  y_is_global,
integer, intent(in), optional  tile_count 
)

Sets the compute domain of a 2D domain.

Definition at line 351 of file mpp_domains_util.inc.

◆ mpp_set_data_domain1d()

subroutine mpp_set_data_domain1d ( type(domain1d), intent(inout)  domain,
integer, intent(in), optional  begin,
integer, intent(in), optional  end,
integer, intent(in), optional  size,
logical, intent(in), optional  is_global 
)

Sets the data domain of a 1D domain.

Definition at line 368 of file mpp_domains_util.inc.

◆ mpp_set_data_domain2d()

subroutine mpp_set_data_domain2d ( type(domain2d), intent(inout)  domain,
integer, intent(in), optional  xbegin,
integer, intent(in), optional  xend,
integer, intent(in), optional  ybegin,
integer, intent(in), optional  yend,
integer, intent(in), optional  xsize,
integer, intent(in), optional  ysize,
logical, intent(in), optional  x_is_global,
logical, intent(in), optional  y_is_global,
integer, intent(in), optional  tile_count 
)

Sets the data domain of a 2D domain.

Definition at line 381 of file mpp_domains_util.inc.

◆ mpp_set_domain_symmetry()

subroutine mpp_set_domain_symmetry ( type(domain2d), intent(inout)  domain,
logical, intent(in)  symmetry 
)

Sets the symmetry attribute of a 2D domain.

Definition at line 1732 of file mpp_domains_util.inc.

◆ mpp_set_global_domain1d()

subroutine mpp_set_global_domain1d ( type(domain1d), intent(inout)  domain,
integer, intent(in), optional  begin,
integer, intent(in), optional  end,
integer, intent(in), optional  size 
)

Sets the global domain of a 1D domain.

Definition at line 398 of file mpp_domains_util.inc.

◆ mpp_set_global_domain2d()

subroutine mpp_set_global_domain2d ( type(domain2d), intent(inout)  domain,
integer, intent(in), optional  xbegin,
integer, intent(in), optional  xend,
integer, intent(in), optional  ybegin,
integer, intent(in), optional  yend,
integer, intent(in), optional  xsize,
integer, intent(in), optional  ysize,
integer, intent(in), optional  tile_count 
)

Sets the global domain of a 2D domain.

Definition at line 409 of file mpp_domains_util.inc.

◆ mpp_set_super_grid_indices()

subroutine mpp_set_super_grid_indices ( type(domain_axis_spec), intent(inout)  grid)

Modifies the indices in the domain_axis_spec type to those of the supergrid.

Parameters
    [in,out]  grid   domain_axis_spec type

Definition at line 276 of file mpp_domains_util.inc.

◆ mpp_shift_nest_domains()

subroutine mpp_shift_nest_domains ( type(nest_domain_type), intent(inout)  nest_domain,
type(domain2d), intent(in), target  domain,
integer, dimension(:), intent(in)  delta_i_coarse,
integer, dimension(:), intent(in)  delta_j_coarse,
integer, intent(in), optional  extra_halo 
)

Based on mpp_define_nest_domains, but only resets the positioning of the nest: it modifies the parent/coarse start and end indices of the nest location and computes the new overlaps of nest PEs on parent PEs. (Ramstrom/HRD moving nest.)

Parameters
    [in,out]  nest_domain     Holds the information needed to pass data between nest and parent grids.
    [in]      domain          Domain for the grid defined in the current pelist.
    [in]      delta_i_coarse  Array of deltas of the coarse grid in the x direction.
    [in]      delta_j_coarse  Array of deltas of the coarse grid in the y direction.
    [in]      extra_halo      Extra halo size.

Definition at line 387 of file mpp_define_nest_domains.inc.
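
Example usage (an illustrative sketch for a moving-nest application; the single nest level and the offsets are assumptions):

    ! Illustrative sketch: shift an existing nest by a fixed offset on the coarse grid.
    subroutine move_nest_east(nest_domain, domain)
      use mpp_domains_mod, only : domain2d, nest_domain_type, mpp_shift_nest_domains
      type(nest_domain_type), intent(inout)      :: nest_domain   ! created by mpp_define_nest_domains
      type(domain2d), intent(in), target         :: domain        ! grid domain on the current pelist
      integer :: delta_i(1), delta_j(1)                           ! one entry per nest level (assumed)

      delta_i = 2                                                 ! move the nest 2 coarse cells in x
      delta_j = 0
      call mpp_shift_nest_domains(nest_domain, domain, delta_i, delta_j)
    end subroutine move_nest_east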

◆ nullify_domain2d_list()

subroutine nullify_domain2d_list ( type(domain2d), intent(inout)  domain)

Nullifies the list of domains contained in a domain2d object.

Definition at line 970 of file mpp_domains_util.inc.

◆ pop_key()

subroutine pop_key ( integer(i8_kind), dimension(:), intent(inout)  sorted,
integer, dimension(-1:), intent(inout)  idx,
integer, intent(inout)  n_idx,
integer, intent(in)  key_idx 
)

Definition at line 594 of file mpp_domains_comm.inc.

◆ print_nest_overlap()

subroutine print_nest_overlap ( type(overlap_type), intent(in)  overlap,
character(len=*), intent(in)  msg 
)

Definition at line 1506 of file mpp_define_nest_domains.inc.

◆ push_key()

integer function push_key ( integer(i8_kind), dimension(:), intent(inout)  sorted,
integer, dimension(-1:), intent(inout)  idx,
integer, intent(inout)  n_idx,
integer, intent(in)  insert,
integer(i8_kind), intent(in)  key,
integer, intent(in)  ival 
)

Definition at line 573 of file mpp_domains_comm.inc.

◆ search_bound_overlap()

type(overlapspec) function, pointer search_bound_overlap ( type(domain2d), intent(in)  domain,
integer, intent(in)  position 
)

This routine finds the boundary overlap at a given position.

Definition at line 1108 of file mpp_domains_util.inc.

◆ search_c2f_nest_overlap()

type(nestspec) function, pointer search_c2f_nest_overlap ( type(nest_domain_type), intent(inout)  nest_domain,
integer, intent(in)  nest_level,
integer, intent(in)  extra_halo,
integer, intent(in)  position 
)

Definition at line 1551 of file mpp_define_nest_domains.inc.

◆ search_check_overlap()

type(overlapspec) function, pointer search_check_overlap ( type(domain2d), intent(in)  domain,
integer, intent(in)  position 
)

This routine finds the check overlap at a given position.

Definition at line 1086 of file mpp_domains_util.inc.

◆ search_f2c_nest_overlap()

type(nestspec) function, pointer search_f2c_nest_overlap ( type(nest_domain_type), intent(inout)  nest_domain,
integer, intent(in)  nest_level,
integer, intent(in)  position 
)

Definition at line 1603 of file mpp_define_nest_domains.inc.

◆ search_update_overlap()

type(overlapspec) function, pointer search_update_overlap ( type(domain2d), intent(inout)  domain,
integer, intent(in)  whalo,
integer, intent(in)  ehalo,
integer, intent(in)  shalo,
integer, intent(in)  nhalo,
integer, intent(in)  position 
)

This routine finds the overlap specification of the domain whose halo sizes match the input whalo, ehalo, shalo, and nhalo.

Definition at line 1029 of file mpp_domains_util.inc.

◆ set_bound_overlap()

subroutine set_bound_overlap ( type(domain2d), intent(inout)  domain,
integer, intent(in)  position 
)

Sets up the boundary overlaps if the domain is symmetric.

Definition at line 6378 of file mpp_domains_define.inc.

◆ set_check_overlap()

subroutine set_check_overlap ( type(domain2d), intent(in)  domain,
integer, intent(in)  position 
)

Sets up the overlaps for boundary checking if the domain is symmetric. The check is done on the current PE at the east boundary for E-cells, the north boundary for N-cells, and both the east and north boundaries for C-cells.

Definition at line 6214 of file mpp_domains_define.inc.

◆ set_contact_point()

subroutine set_contact_point ( type(domain2d), intent(inout)  domain,
integer, intent(in)  position 
)

This routine sets the overlaps between tiles for E-, C-, and N-cells based on the T-cell overlaps.

Definition at line 5966 of file mpp_domains_define.inc.

◆ set_domain_comm_inf()

subroutine set_domain_comm_inf ( type(overlapspec), intent(inout)  update)

Definition at line 8214 of file mpp_domains_define.inc.

◆ set_domain_id()

integer(i8_kind) function set_domain_id ( integer(i8_kind), intent(in)  d_id,
integer, intent(in)  ksize,
integer, intent(in), optional  flags,
integer, intent(in), optional  gtype,
integer, intent(in), optional  position,
integer, intent(in), optional  whalo,
integer, intent(in), optional  ehalo,
integer, intent(in), optional  shalo,
integer, intent(in), optional  nhalo 
)

Definition at line 707 of file mpp_domains_comm.inc.

◆ set_group_update()

subroutine set_group_update ( type(mpp_group_update_type), intent(inout)  group,
type(domain2d), intent(inout)  domain 
)

Sets up the group update structure for the given domain.

Definition at line 1919 of file mpp_domains_util.inc.

◆ set_overlaps()

subroutine set_overlaps ( type(domain2d), intent(in)  domain,
type(overlapspec), intent(in)  overlap_in,
type(overlapspec), intent(inout)  overlap_out,
integer, intent(in)  whalo_out,
integer, intent(in)  ehalo_out,
integer, intent(in)  shalo_out,
integer, intent(in)  nhalo_out 
)

This routine sets up the overlaps used by mpp_update_domains for an arbitrary halo update. The input halo sizes should be those defined in mpp_define_domains, while xhalo_out and yhalo_out should not be exactly the same as xhalo_in and yhalo_in. The tripolar-grid case is not yet handled, because in the folded north region the overlap is specified through a list of points rather than rectangles; this may be addressed in the future.

Definition at line 4995 of file mpp_domains_define.inc.

◆ set_single_overlap()

subroutine set_single_overlap ( type(overlap_type), intent(in)  overlap_in,
type(overlap_type), intent(inout)  overlap_out,
integer, intent(in)  isoff,
integer, intent(in)  ieoff,
integer, intent(in)  jsoff,
integer, intent(in)  jeoff,
integer, intent(in)  index,
integer, intent(in)  dir,
integer, intent(in), optional  rotation 
)

Definition at line 5233 of file mpp_domains_define.inc.

Variable Documentation

◆ a2_sort_len

integer, save a2_sort_len =0
private

length sorted memory list

Definition at line 688 of file mpp_domains.F90.

◆ a_sort_len

integer, save a_sort_len =0
private

length sorted memory list

Definition at line 681 of file mpp_domains.F90.

◆ addr2_base

integer(i8_kind), parameter addr2_base = 65536_i8_kind
private

= 0x0000000000010000

Definition at line 684 of file mpp_domains.F90.

◆ addrs2_idx

integer, dimension(-1:max_addrs2), save addrs2_idx =-9999
private

index of addr2 associated with d_comm

Definition at line 687 of file mpp_domains.F90.

◆ addrs2_sorted

integer(i8_kind), dimension(max_addrs2), save addrs2_sorted =-9999
private

list of sorted local addresses

Definition at line 686 of file mpp_domains.F90.

◆ addrs_idx

integer, dimension(-1:max_addrs), save addrs_idx =-9999
private

index of address associated with d_comm

Definition at line 680 of file mpp_domains.F90.

◆ addrs_sorted

integer(i8_kind), dimension(max_addrs), save addrs_sorted =-9999
private

list of sorted local addresses

Definition at line 679 of file mpp_domains.F90.

◆ complete_group_update_on

logical complete_group_update_on = .false.
private

Definition at line 675 of file mpp_domains.F90.

◆ complete_update

logical complete_update = .false.
private

Definition at line 670 of file mpp_domains.F90.

◆ current_id_update

integer current_id_update = 0
private

Definition at line 664 of file mpp_domains.F90.

◆ d_comm

type(domaincommunicator2d), dimension(:), allocatable, target, save d_comm
private

domain communicators

Definition at line 702 of file mpp_domains.F90.

◆ d_comm_idx

integer, dimension(-1:max_fields), save d_comm_idx =-9999
private

index of d_comm associated with sorted addresses

Definition at line 703 of file mpp_domains.F90.

◆ dc_sort_len

integer, save dc_sort_len =0
private

length sorted comm keys (=num active communicators)

Definition at line 705 of file mpp_domains.F90.

◆ dckey_sorted

integer(i8_kind), dimension(max_fields), save dckey_sorted =-9999
private

list of sorted local addresses

Definition at line 699 of file mpp_domains.F90.

◆ debug

logical debug = .FALSE.
private

Definition at line 656 of file mpp_domains.F90.

◆ debug_message_passing

logical debug_message_passing = .false.
private

If .true., checks the consistency of data on the boundaries between processors/tiles when updating a symmetric domain, and also checks the consistency on the northern folded edge.

Definition at line 737 of file mpp_domains.F90.

◆ debug_update_domain

character(len=32) debug_update_domain = "none"
private

namelist interface

When debug_update_domain is set to "none", no debugging is done. When it is set to "fatal", the run exits with a fatal error message; when set to "warning", the run outputs a warning message; when set to "note", the run outputs a note message.

Definition at line 730 of file mpp_domains.F90.
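
Example usage (an illustrative sketch; the grouping of this variable under an mpp_domains_nml namelist group is an assumption based on the "namelist interface" note above):

    ! Illustrative namelist sketch: request warning-level halo-update checking at run time.
    &mpp_domains_nml
        debug_update_domain = 'warning'    ! one of: none, fatal, warning, note
    /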

◆ debug_update_level

integer debug_update_level = NO_CHECK
private

Definition at line 755 of file mpp_domains.F90.

◆ domain_clocks_on

logical domain_clocks_on =.FALSE.
private

Definition at line 718 of file mpp_domains.F90.

◆ domain_cnt

integer(i8_kind) domain_cnt =0
private

Definition at line 715 of file mpp_domains.F90.

◆ efp_sum_overflow_check

logical efp_sum_overflow_check = .false.
private

If .true., an overflow check is always performed when computing an EFP bitwise mpp_global_sum.

Definition at line 746 of file mpp_domains.F90.

◆ field_s

integer, parameter field_s = 0
private

Definition at line 219 of file mpp_domains.F90.

◆ field_x

integer, parameter field_x = 1
private

Definition at line 220 of file mpp_domains.F90.

◆ field_y

integer, parameter field_y = 2
private

Definition at line 221 of file mpp_domains.F90.

◆ group_pack_clock

integer group_pack_clock =0
private

Definition at line 725 of file mpp_domains.F90.

◆ group_recv_clock

integer group_recv_clock =0
private

Definition at line 725 of file mpp_domains.F90.

◆ group_send_clock

integer group_send_clock =0
private

Definition at line 725 of file mpp_domains.F90.

◆ group_unpk_clock

integer group_unpk_clock =0
private

Definition at line 725 of file mpp_domains.F90.

◆ group_update_buffer_pos

integer group_update_buffer_pos = 0
private

Definition at line 674 of file mpp_domains.F90.

◆ group_wait_clock

integer group_wait_clock =0
private

Definition at line 725 of file mpp_domains.F90.

◆ gt_base

integer(i8_kind), parameter gt_base = 256_i8_kind
private

Definition at line 710 of file mpp_domains.F90.

◆ i_sort_len

integer, save i_sort_len =0
private

length sorted domain ids list

Definition at line 694 of file mpp_domains.F90.

◆ ids_idx

integer, dimension(-1:max_dom_ids), save ids_idx =-9999
private

index of d_comm associated with sorted addresses

Definition at line 693 of file mpp_domains.F90.

◆ ids_sorted

integer(i8_kind), dimension(max_dom_ids), save ids_sorted =-9999
private

list of sorted domain identifiers

Definition at line 692 of file mpp_domains.F90.

◆ ke_base

integer(i8_kind), parameter ke_base = 281474976710656_i8_kind
private

Definition at line 713 of file mpp_domains.F90.

◆ max_addrs

integer, parameter max_addrs =512
private

Definition at line 678 of file mpp_domains.F90.

◆ max_addrs2

integer, parameter max_addrs2 =128
private

Definition at line 685 of file mpp_domains.F90.

◆ max_dom_ids

integer, parameter max_dom_ids =128
private

Definition at line 691 of file mpp_domains.F90.

◆ max_fields

integer, parameter max_fields =1024
private

Definition at line 698 of file mpp_domains.F90.

◆ max_nonblock_update

integer, parameter max_nonblock_update = 100
private

Definition at line 672 of file mpp_domains.F90.

◆ maxlist

integer, parameter maxlist = 100
private

Definition at line 217 of file mpp_domains.F90.

◆ maxoverlap

integer, parameter maxoverlap = 200
private

Definition at line 218 of file mpp_domains.F90.

◆ module_is_initialized

logical module_is_initialized = .false.
private

Definition at line 655 of file mpp_domains.F90.

◆ mosaic_defined

logical mosaic_defined = .false.
private

Definition at line 658 of file mpp_domains.F90.

◆ mpp_domains_stack_hwm

integer mpp_domains_stack_hwm =0
private

Definition at line 660 of file mpp_domains.F90.

◆ mpp_domains_stack_size

integer mpp_domains_stack_size =0
private

Definition at line 659 of file mpp_domains.F90.

◆ n_addrs

integer, save n_addrs =0
private

number of memory addresses used

Definition at line 682 of file mpp_domains.F90.

◆ n_addrs2

integer, save n_addrs2 =0
private

number of memory addresses used

Definition at line 689 of file mpp_domains.F90.

◆ n_comm

integer, save n_comm =0
private

number of communicators used

Definition at line 707 of file mpp_domains.F90.

◆ n_ids

integer, save n_ids =0
private

number of domain ids used (=i_sort_len; domain ids are never removed)

Definition at line 695 of file mpp_domains.F90.

◆ nest_pack_clock

integer nest_pack_clock =0
private

Definition at line 724 of file mpp_domains.F90.

◆ nest_recv_clock

integer nest_recv_clock =0
private

Definition at line 723 of file mpp_domains.F90.

◆ nest_send_clock

integer nest_send_clock =0
private

Definition at line 723 of file mpp_domains.F90.

◆ nest_unpk_clock

integer nest_unpk_clock =0
private

Definition at line 723 of file mpp_domains.F90.

◆ nest_wait_clock

integer nest_wait_clock =0
private

Definition at line 724 of file mpp_domains.F90.

◆ no_check

integer, parameter no_check = -1
private

Definition at line 754 of file mpp_domains.F90.

◆ nonblock_buffer_pos

integer nonblock_buffer_pos = 0
private

Definition at line 667 of file mpp_domains.F90.

◆ nonblock_data

type(nonblock_type), dimension(:), allocatable nonblock_data
private

Definition at line 671 of file mpp_domains.F90.

◆ nonblock_group_buffer_pos

integer nonblock_group_buffer_pos = 0
private

Definition at line 668 of file mpp_domains.F90.

◆ nonblock_group_pack_clock

integer nonblock_group_pack_clock =0
private

Definition at line 726 of file mpp_domains.F90.

◆ nonblock_group_recv_clock

integer nonblock_group_recv_clock =0
private

Definition at line 726 of file mpp_domains.F90.

◆ nonblock_group_send_clock

integer nonblock_group_send_clock =0
private

Definition at line 726 of file mpp_domains.F90.

◆ nonblock_group_unpk_clock

integer nonblock_group_unpk_clock =0
private

Definition at line 727 of file mpp_domains.F90.

◆ nonblock_group_wait_clock

integer nonblock_group_wait_clock =0
private

Definition at line 727 of file mpp_domains.F90.

◆ nthread_control_loop

integer nthread_control_loop = 8
private

Determines the loop order for packing and unpacking. When the number of threads is greater than nthread_control_loop, the k-loop is moved outside and combined with the number of packs and unpacks. When the number of threads is less than or equal to nthread_control_loop, the k-loop is moved inside, but remains outside of the j and i loops.

Definition at line 740 of file mpp_domains.F90.

◆ null_domain1d

type(domain1d), save, public null_domain1d

Definition at line 661 of file mpp_domains.F90.

◆ null_domain2d

type(domain2d), save, public null_domain2d

Definition at line 662 of file mpp_domains.F90.

◆ null_domainug

type(domainug), save, public null_domainug

Definition at line 663 of file mpp_domains.F90.

◆ num_nonblock_group_update

integer num_nonblock_group_update = 0
private

Definition at line 666 of file mpp_domains.F90.

◆ num_update

integer num_update = 0
private

Definition at line 665 of file mpp_domains.F90.

◆ pack_clock

integer pack_clock =0
private

Definition at line 720 of file mpp_domains.F90.

◆ pe

integer pe
private

Definition at line 654 of file mpp_domains.F90.

◆ recv_clock

integer recv_clock =0
private

Definition at line 719 of file mpp_domains.F90.

◆ recv_clock_nonblock

integer recv_clock_nonblock =0
private

Definition at line 721 of file mpp_domains.F90.

◆ send_clock

integer send_clock =0
private

Definition at line 719 of file mpp_domains.F90.

◆ send_pack_clock_nonblock

integer send_pack_clock_nonblock =0
private

Definition at line 721 of file mpp_domains.F90.

◆ start_update

logical start_update = .true.
private

Definition at line 669 of file mpp_domains.F90.

◆ unpk_clock

integer unpk_clock =0
private

Definition at line 719 of file mpp_domains.F90.

◆ unpk_clock_nonblock

integer unpk_clock_nonblock =0
private

Definition at line 721 of file mpp_domains.F90.

◆ use_alltoallw

logical use_alltoallw = .false.
private

Definition at line 748 of file mpp_domains.F90.

◆ verbose

logical verbose =.FALSE.
private

Definition at line 657 of file mpp_domains.F90.

◆ wait_clock

integer wait_clock =0
private

Definition at line 720 of file mpp_domains.F90.

◆ wait_clock_nonblock

integer wait_clock_nonblock =0
private

Definition at line 722 of file mpp_domains.F90.