NCAR atmospheric component; LANL – ocean, land ice and sea ice - PowerPoint PPT Presentation

SLIDE 1

NCAR: Bill Skamarock, Joe Klemp, Michael Duda, Laura Fowler, Sang-Hun Park

MPAS-Atmosphere (the NCAR atmospheric component):

  • Nonhydrostatic global atmospheric model
  • Time integration as in the Advanced Research WRF (ARW)
  • Spatial discretization similar to ARW, except for Voronoi mesh accommodations
  • LANL – ocean, land ice and sea ice components
SLIDE 2

Motivation. Mesh flexibility: North American refinement

SLIDE 3

Motivation. Mesh flexibility: Refinement for equatorial convection

SLIDE 4

Motivation. Mesh flexibility: Refinement around the Andes

SLIDE 5

Unstructured spherical centroidal Voronoi meshes

  • Mostly hexagons, some pentagons and 7-sided cells
  • Cell centers are at the cell center-of-mass (centroidal)
  • Cell edges bisect, and are orthogonal to, the lines connecting cell centers
  • Uniform resolution – the traditional icosahedral mesh

Centroidal Voronoi Meshes

C-grid

  • Solve for normal velocities on cell edges
  • Gradient operators in the horizontal momentum equations are 2nd-order accurate
  • Velocity divergence is 2nd-order accurate for edge-centered velocities

SLIDE 6

Given an initial set of generating points, Lloyd’s method may be used to arrive at a CVT:

  • 1. Begin with any set of initial points (the generating point set)
  • 2. Construct a Voronoi diagram for the set
  • 3. Locate the mass centroid of each Voronoi cell
  • 4. Move each generating point to the mass centroid of its Voronoi cell
  • 5. Repeat 2–4 until convergence

From Du et al. (1999)

MacQueen’s method, a randomized alternative to Lloyd’s method, may also be used; no Voronoi diagrams need to be constructed, but convergence is generally much slower.

Centroidal Voronoi Meshes: Lloyd’s Method
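The iteration above is easy to prototype. Below is a minimal, hypothetical 1-D sketch of Lloyd's method on the interval [0, 1] with uniform density; MPAS applies the same idea on the sphere, with a density function to produce variable resolution. The function `lloyd_1d` is illustrative, not MPAS code.

```python
def lloyd_1d(points, iterations=200):
    """Relax generating points toward a 1-D centroidal Voronoi tessellation."""
    pts = sorted(points)
    for _ in range(iterations):
        # Step 2: the Voronoi cell of point i is bounded by the midpoints
        # to its neighbors (the domain ends 0 and 1 bound the outer cells).
        bounds = [0.0] + [0.5 * (a + b) for a, b in zip(pts, pts[1:])] + [1.0]
        # Steps 3-4: with uniform density, each cell's mass centroid is its
        # midpoint; move every generator there, then repeat (step 5).
        pts = [0.5 * (lo + hi) for lo, hi in zip(bounds, bounds[1:])]
    return pts

pts = lloyd_1d([0.05, 0.06, 0.3, 0.9])
# With uniform density the converged CVT has evenly spaced generators.
```

Replacing the uniform density with a spatially varying one clusters generators where the density is high, which is exactly how the refined meshes on the following slides are produced.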

SLIDE 7

Centroidal Voronoi Meshes: Mesh Generation

Mesh generation beginning from an icosahedral mesh. All points are free.

SLIDE 8

Centroidal Voronoi Meshes: Mesh Generation

Mesh generation beginning from an icosahedral mesh with constrained refinement. Only points in the refinement region are free.

SLIDE 9

MPAS Meshes

SLIDE 10

Parallel decomposition

Graph partitioning

The dual mesh of a Voronoi tessellation is a Delaunay triangulation – essentially the connectivity graph of the cells. Parallel decomposition of an MPAS mesh then becomes a graph partitioning problem: distribute nodes equally among partitions (giving each process equal work) while minimizing the edge cut (minimizing parallel communication). We use the Metis package for graph decomposition.

  • Currently done as a pre-processing step, but could be done “on-line”

Metis also handles weighted graph partitioning:

  • Given a priori estimates of the computational cost of each grid cell, we can better balance the load among processes
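MPAS relies on Metis for the actual partitioning; the toy sketch below only illustrates the objective Metis optimizes – equal-sized parts with a small edge cut – on a hypothetical 6-cell ring of cells (its dual connectivity graph). All names here are illustrative.

```python
# Connectivity graph of 6 cells arranged in a ring: cell id -> neighbors.
graph = {
    0: [1, 5], 1: [0, 2], 2: [1, 3],
    3: [2, 4], 4: [3, 5], 5: [4, 0],
}

def edge_cut(graph, part):
    """Count edges whose endpoints land in different partitions; each cut
    edge implies parallel communication between the owning processes."""
    cut = 0
    for cell, nbrs in graph.items():
        for nbr in nbrs:
            if cell < nbr and part[cell] != part[nbr]:
                cut += 1
    return cut

# Two equal-work assignments of the 6 cells to 2 processes: both balance
# the load, but the contiguous blocks communicate far less.
contiguous = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1}   # cut = 2
scattered  = {0: 0, 1: 1, 2: 0, 3: 1, 4: 0, 5: 1}   # cut = 6
```

A weighted variant would attach a cost to each node (cell) and balance total cost per partition instead of cell counts, as described above.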

SLIDE 11

Parallel decomposition

Figure: block of cells owned by a process; block plus two layers of halo/ghost cells.

Parallel decomposition of the MPAS mesh:

  • MPAS uses halo (or ghost) cells; halo values are communicated between processes (currently via MPI).
  • Multithreading: a block of cells (in shared memory) is divided among nThreads threads and advanced in parallel (using OpenMP); the figure shows nThreads = 3.

SLIDE 12

Given an assignment of cells to a process, any number of layers of halo (ghost) cells may be added.

Figure: block of cells owned by a process; block plus one layer of halo/ghost cells; block plus two layers of halo/ghost cells.

Cells are stored in a 1-d array (2-d with the vertical dimension, etc.), with halo cells at the end of the array; the order of real cells may be updated to provide better cache re-use.

With a complete list of cells stored in a block, adjacent edge and vertex locations can be found; we apply a simple rule to determine ownership of edges and vertices adjacent to real cells in different blocks.

Parallel decomposition
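The halo construction described above can be sketched as a breadth-first expansion from the owned cells: each extra layer is the set of neighbors of everything collected so far, appended after the "real" cells to mirror the 1-d storage order. The toy chain connectivity and function names are hypothetical.

```python
def add_halo_layers(adjacency, owned, n_layers):
    """Return owned cells followed by n_layers rings of halo cells."""
    block = list(owned)
    known = set(owned)
    frontier = set(owned)
    for _ in range(n_layers):
        # Next halo layer: neighbors of the current frontier not yet known.
        layer = {n for c in frontier for n in adjacency[c]} - known
        block.extend(sorted(layer))   # halo cells go at the end of the array
        known |= layer
        frontier = layer
    return block

# Cells 0..7 in a 1-d chain: 0-1-2-3-4-5-6-7.
chain = {i: [j for j in (i - 1, i + 1) if 0 <= j <= 7] for i in range(8)}
block = add_halo_layers(chain, owned=[3, 4], n_layers=2)
# Owned cells first, then halo layer 1 (cells 2, 5), then layer 2 (1, 6).
```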

SLIDE 13

Nonhydrostatic Atmospheric Solver

Equation panels shown on the slide: variables; prognostic equations; diagnostics and definitions; vertical coordinate.

Equations:

  • Prognostic equations for coupled variables
  • Generalized height coordinate
  • Horizontally vector-invariant equation set
  • Continuity equation for dry air mass
  • Thermodynamic equation for coupled potential temperature

Time integration scheme: as in the Advanced Research WRF – split-explicit Runge-Kutta (3rd order).

Nonhydrostatic formulation; shallow-atmosphere approximation.

SLIDE 14

Smoothed Terrain-Following (STF) hybrid coordinate

Multiple passes of a simple Laplacian smoother are applied at each level: STF progressively smooths coordinate surfaces while transitioning to a height coordinate (compare the BTF and STF panels).

Specification of terrain:

  • High-resolution terrain data (30 arcsec) averaged over the grid-cell area
  • Terrain smoothing with one pass of a 4th-order Laplacian

The scheme controls the rate at which terrain influences are attenuated with height, with a terrain influence that represents increased smoothing of the actual terrain with height.

MPAS Vertical Mesh
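The "multiple passes of a simple Laplacian smoother" idea can be illustrated in 1-D: each pass replaces a point with a blend of itself and its neighbors, damping small-scale terrain bumps, and applying more passes at higher model levels yields progressively smoother coordinate surfaces. The coefficient and function below are illustrative, not MPAS's actual formulation.

```python
def laplacian_smooth(z, passes, alpha=0.25):
    """Apply `passes` sweeps of z_i += alpha * (z_{i-1} - 2 z_i + z_{i+1}),
    holding the endpoints fixed."""
    z = list(z)
    for _ in range(passes):
        z = [z[0]] + [
            z[i] + alpha * (z[i - 1] - 2.0 * z[i] + z[i + 1])
            for i in range(1, len(z) - 1)
        ] + [z[-1]]
    return z

terrain = [0.0, 0.0, 1.0, 0.0, 0.0]           # an isolated peak
lower = laplacian_smooth(terrain, passes=1)   # coordinate surface near the ground
upper = laplacian_smooth(terrain, passes=20)  # coordinate surface aloft
# The peak's imprint on the surface shrinks as the number of passes grows.
```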

SLIDE 15

MPAS – Tibetan Plateau, 28°N

Panels: basic terrain-following (BTF) coordinate vs. smoothed hybrid terrain-following (STF) coordinate, on 15 km, 7.5 km, and 3 km grids.

(Model top is at 30 km)

SLIDE 16

Nonhydrostatic Atmospheric Solver

Prognostic equations require: (1) gradient operators, (2) flux divergence operators, (3) the nonlinear Coriolis term.

SLIDE 17

On the Voronoi mesh, the line P1P2 is perpendicular to the edge v1v2 and is bisected by it; hence ∂P/∂x ≈ (P2 − P1)/dxe is 2nd-order accurate.

Operators on the Voronoi Mesh: pressure and KE gradients
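The 2nd-order claim above is easy to check numerically for a smooth field: a centered difference across an edge midpoint, (P2 − P1)/dx, has error O(dx²), which is why the bisection/orthogonality property of the Voronoi mesh matters – it places the edge at the midpoint of P1P2. The test field P = sin(x) below is an arbitrary choice for illustration.

```python
import math

def centered_diff_error(dx, x=1.0):
    """Error of (P(x + dx/2) - P(x - dx/2)) / dx against P'(x), for P = sin."""
    approx = (math.sin(x + 0.5 * dx) - math.sin(x - 0.5 * dx)) / dx
    return abs(approx - math.cos(x))

# Halving dx should cut the error by about a factor of 4 (2nd order).
e1 = centered_diff_error(0.1)
e2 = centered_diff_error(0.05)
ratio = e1 / e2   # close to 4
```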

SLIDE 18

Operators on the Voronoi Mesh: cell-center KE evaluation

Cell-center kinetic energy: KEi; vertex kinetic energy: KEv

SLIDE 22

Derivation steps (equations shown on the slide): transport equation in conservative form; finite-volume formulation (integrate over the cell); apply the divergence theorem; discretize in time and space. The velocity divergence operator is 2nd-order accurate for edge-centered velocities.

Operators on the Voronoi Mesh: flux divergence and transport
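The derivation steps listed above can be reconstructed in standard finite-volume notation; this is a hedged sketch, and the slide's own symbols may differ.

```latex
% Transport equation in conservative form:
\frac{\partial (\rho \psi)}{\partial t} + \nabla \cdot (\rho \mathbf{u}\, \psi) = 0
% Integrate over cell i of area A_i and apply the divergence theorem:
\frac{\partial}{\partial t}\int_{A_i} \rho \psi \, dA
  = -\oint_{\partial A_i} \rho \psi \, \mathbf{u} \cdot \mathbf{n} \, dl
% Discretize in time and space, summing over the n_e edges of cell i
% (edge length l_e, edge-normal velocity u_e, edge-flux value \psi_e):
(\overline{\rho \psi})_i^{\,t+\Delta t} = (\overline{\rho \psi})_i^{\,t}
  - \frac{\Delta t}{A_i} \sum_{e=1}^{n_e} l_e \, u_e \, \rho_e \, \psi_e
```

The final line makes the 2nd-order claim concrete: the discrete divergence is a sum of edge-normal fluxes around the polygonal cell.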

SLIDE 23

Operators on the Voronoi Mesh: flux divergence and transport

Scalar transport equation for cell i:

  • 1. The scalar edge-flux value ψ is a weighted sum of cell values from the cells that share the edge and all their neighbors.
  • 2. An individual edge flux is used to update the two cells that share the edge.
  • 3. Three edge-flux evaluations and cell updates are needed to complete the Runge-Kutta timestep.
  • 4. The monotonic constraint requires checking the cell-value update and renormalizing edge fluxes if the cell updates fall outside specified bounds (on the final RK3 update).
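Step 4 above – scaling back fluxes when a provisional update would leave the allowed bounds – has the flavor of a flux-corrected limiter. The sketch below conveys only that flavor; the function name and the single-cell, single-net-flux setup are hypothetical simplifications of the actual MPAS renormalization.

```python
def limit_update(value, net_flux, lo, hi):
    """Return a scale factor in [0, 1] for the fluxes so that
    value + scale * net_flux stays within [lo, hi]."""
    provisional = value + net_flux
    if lo <= provisional <= hi:
        return 1.0          # update already in bounds; keep the full flux
    target = hi if provisional > hi else lo
    # Shrink the fluxes just enough to land exactly on the violated bound.
    return (target - value) / net_flux

scale = limit_update(value=0.9, net_flux=0.4, lo=0.0, hi=1.0)
# scale = 0.25, so the limited update is 0.9 + 0.25 * 0.4 = 1.0
```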

SLIDE 24

Operators on the Voronoi Mesh: ‘nonlinear’ Coriolis force

Nonlinear term and tangential velocity reconstruction (equations shown on the slide).

The general tangential velocity reconstruction produces a consistent divergence on the primal and dual grids, and allows for PV, enstrophy, and energy* conservation in the nonlinear shallow-water solver.

SLIDE 25

Operators on the Voronoi Mesh: ‘nonlinear’ Coriolis force

Examples: absolute vorticity at vertex a; absolute vorticity at edge e13.

SLIDE 26

Time Integration: Dynamics and Scalar Transport Options

Default time integration. The scheme allows for smaller dynamics timesteps relative to the scalar transport timestep and the main physics timestep. Any transport scheme can be used here (we are not tied to an RK-based scheme); scalar transport and physics are the expensive pieces in most applications.

  Call physics
  Do dynamics_split_steps
    Do step_rk3 = 1, 3
      compute large-time-step tendency
      Do acoustic_steps
        update u
        update rho, theta and w
      End acoustic_steps
    End rk3 step
  End dynamics_split_steps
  Do scalar step_rk3 = 1, 3
    scalar RK3 transport
  End scalar rk3 step
  Call microphysics
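The nesting of the loop structure above (split steps → RK3 stages → acoustic sub-steps, with scalar transport in a separate RK3 loop) can be made concrete with plain counters standing in for the actual tendency and update routines; the counts and parameter values below are illustrative, not MPAS defaults.

```python
def timestep(n_split=2, n_acoustic=4):
    """Count how often each stand-in routine runs in one model timestep."""
    counts = {"large_dt_tendency": 0, "acoustic_update": 0, "scalar_rk3": 0}
    # physics would be called once here
    for _ in range(n_split):             # dynamics_split_steps
        for _ in range(3):               # RK3 stages
            counts["large_dt_tendency"] += 1
            for _ in range(n_acoustic):  # acoustic sub-steps (u; rho/theta/w)
                counts["acoustic_update"] += 1
    for _ in range(3):                   # scalar RK3 transport, separate loop
        counts["scalar_rk3"] += 1
    # microphysics would be called once here
    return counts

counts = timestep()
# 2 split steps x 3 RK3 stages = 6 tendency computations, each followed by
# 4 acoustic sub-steps = 24 acoustic updates; scalar transport runs 3 times.
```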

SLIDE 27

MPAS Release – V5.0, January 2017

(1) mesoscale_reference physics suite:

  • Surface layer (Monin-Obukhov): module_sf_sfclay.F, as in WRF 3.8.1
  • PBL: YSU, as in WRF 3.8.1
  • Land surface model: Noah (4 layers), as in WRF 3.3.1
  • Gravity wave drag: YSU gravity wave drag scheme
  • Convection: new Tiedtke (nTiedtke), as in WRF 3.8.1
  • Microphysics: WSM6, as in WRF 3.8.1
  • Radiation: RRTMG sw and RRTMG lw, as in WRF 3.8.1

(2) scale_aware_CP (convection-permitting) physics suite:

  • Surface layer: module_sf_mynn.F, as in WRF 3.6.1
  • PBL: Mellor-Yamada-Nakanishi-Niino (MYNN), as in WRF 3.6.1
  • Land surface model: Noah (4 layers), as in WRF 3.3.1
  • Gravity wave drag: YSU gravity wave drag scheme, WRF 3.6.1
  • Convection: Grell-Freitas scale-aware scheme (modified from WRF 3.6.1)
  • Microphysics: Thompson scheme (non-aerosol-aware), as in WRF 3.8.1
  • Radiation: RRTMG sw and RRTMG lw, as in WRF 3.8.1

SLIDE 28

10-day 500 hPa Relative Vorticity Forecast

Panels: 15 km uniform-resolution mesh vs. 15–60 km variable-resolution mesh. (Figure labels: 16 km, 36 km, 56 km mesh spacings; color-scale ticks 1, 2; units s⁻¹ × 10⁴.)

MPAS Physics:

  • WSM6 cloud microphysics
  • Tiedtke convection scheme
  • Monin-Obukhov surface layer
  • YSU PBL
  • Noah land surface
  • RRTMG lw and sw
SLIDE 29

Hazardous Weather Testbed Spring Experiment 2015: Forecast Results from MPAS

MPAS Physics:

  • WSM6 cloud microphysics
  • Grell-Freitas convection scheme (scale-aware)
  • Monin-Obukhov surface layer
  • MYNN PBL
  • Noah land surface
  • RRTMG lw and sw

MPAS mesh: 50–3 km variable resolution; CONUS is the 3 km region, with a very smooth transition. The 3–50 km mesh (∆x contours at 4, 8, 12, 20, 30, and 40 km) has approximately 6.85 million cells; 68% have < 4 km spacing (158 pentagons, 146 septagons).

Figure: MPAS mesh mean cell spacing (km).

SLIDE 30

Hazardous Weather Testbed Spring Experiment 2015: Forecast Results from MPAS

1 km AGL reflectivity; forecasts valid 2015-05-17 06 UTC (6 h, 30 h, 54 h, 78 h, and 102 h forecasts).

Observed reflectivity: NOAA SPC archive, 2015-05-17 06 UTC.

SLIDE 31

Resolving Atmospheric Convection

Updraft diameter: D. Eddies responsible for entrainment/detrainment: diameter d << D. Mesh spacing needed to resolve these turbulent eddies: h << d, D.

Typical updraft diameters D:

  • Severe convection: 5–8 km
  • Typical midlatitude cells: 2–4 km
  • Tropical cells: 1–2 km
  • Shallow convection: 0.1–1 km

Resolution needed to resolve deep convection: h ~ O(100 m). Resolution needed to resolve shallow convection: h ~ O(10 m).

SLIDE 32

Skew-T / log-P diagrams for conditionally unstable (left) warm-season continental midlatitude and (right) oceanic tropical environments.

Adapted from Trier, S.B., 2003: Convective storms: Convective initiation. Encyclopedia of Atmospheric Sciences, Academic Press, 560-569; Figure 1.

It is not surprising that tropical convection is more difficult to simulate

Resolving Atmospheric Convection

SLIDE 33

We are not resolving convection at ∆x ~ O(km). A few large, deep, laminar explicit cells act as the implicit parameterization of many smaller cells possessing a spectrum of sizes and depths.

Parameterizations: we wish to retain the beneficial aspects of the explicitly simulated cells while correcting for their deficiencies through the parameterization(s). This is a non-standard parameterization problem.

Q: How are explicit simulations at ∆x ~ O(km) deficient in different convective regimes?

Q: What approaches are needed for constructing effective convective parameterizations at ∆x ~ O(km)?

George Bryan (2008)

Resolving Atmospheric Convection

SLIDE 34

Summary

  • MPAS-Atmosphere, using C-grid centroidal Voronoi meshes and a finite-volume formulation, appears viable for NWP and climate applications.
  • Variable-resolution results for MPAS solvers are encouraging.
  • Initial MPI implementations of MPAS-A show efficiencies and scalings comparable to other models (e.g. WRF).
  • MPAS-A: our use of variable-resolution meshes is leading us to consider scale-awareness issues in our physics. Challenges: (1) physics for hydrostatic/nonhydrostatic meshes; (2) physics for long-term hydrostatic-scale variable-mesh integrations.
  • Other challenges: data assimilation for variable-resolution meshes; convection at convection-permitting resolutions?

Further information, and access to MPAS Version 5.0: http://mpas-dev.github.io/