Parallel computations for weather research and environment protection


  1. Parallel computations for weather research and environment protection Alexander V. Starchenko Tomsk State University, Institute of Atmospheric Optics

  2. Introduction One of the methods for studying and forecasting local atmospheric processes and pollution transport over a bounded territory is mathematical modelling, based on the application of mesoscale meteorological models and regional environmental models.

  3. Historical facts... • The British mathematician L.F. Richardson first proposed numerical weather prediction in 1922. Richardson attempted to perform a numerical forecast by hand, but it was not successful. He and the Russian scientist A.A. Friedman were, in the 1920s, among the first to understand the need for high-performance computing resources for meteorological forecasts. • The first successful numerical prediction was performed in 1950 by a team composed of the American meteorologist J. Charney, the Norwegian meteorologist Ragnar Fjørtoft and the applied mathematician J. von Neumann, using the ENIAC digital computer. They used a simplified form of atmospheric dynamics based on the barotropic vorticity equation. This simplification greatly reduced demands on computer time and memory, so that the computations could be performed on the relatively primitive computers available at the time. Later models used more complete equations for atmospheric dynamics and thermodynamics. • Operational numerical weather prediction (i.e., routine predictions for practical use) began in 1955 under a joint project by the U.S. Air Force, Navy, and Weather Bureau.

  4. Mesoscale models and processes • A mesoscale model is a numerical weather prediction model with sufficiently high horizontal and vertical resolution to forecast mesoscale weather phenomena. These phenomena are often forced by topography or coastlines, or are related to convection. • Most severe weather occurs at the mesoscale, including tornadoes and mesoscale convective systems. Visibility, turbulence and sensible weather can vary enormously over just a few kilometers and have a tremendous impact on operations.

  5. Mesoscale models include, as a rule, unsteady three-dimensional equations of hydro-thermodynamics and differ in their approaches to the parameterisation of atmospheric processes.

  6. Basic governing equations of MM

Continuity equation

$$\frac{\partial \rho}{\partial t} + \frac{\partial (\rho u)}{\partial x} + \frac{\partial (\rho v)}{\partial y} + \frac{\partial (\rho w)}{\partial z} = 0$$

Equations of hydrodynamics

$$\rho\left(\frac{\partial u}{\partial t} + u\frac{\partial u}{\partial x} + v\frac{\partial u}{\partial y} + w\frac{\partial u}{\partial z}\right) = -\frac{\partial p}{\partial x} + \rho f v + \Gamma_H\left(\frac{\partial^2 u}{\partial x^2} + \frac{\partial^2 u}{\partial y^2}\right) + \frac{\partial}{\partial z}\left(\Gamma_Z\frac{\partial u}{\partial z}\right)$$

$$\rho\left(\frac{\partial v}{\partial t} + u\frac{\partial v}{\partial x} + v\frac{\partial v}{\partial y} + w\frac{\partial v}{\partial z}\right) = -\frac{\partial p}{\partial y} - \rho f u + \Gamma_H\left(\frac{\partial^2 v}{\partial x^2} + \frac{\partial^2 v}{\partial y^2}\right) + \frac{\partial}{\partial z}\left(\Gamma_Z\frac{\partial v}{\partial z}\right)$$

$$\rho\left(\frac{\partial w}{\partial t} + u\frac{\partial w}{\partial x} + v\frac{\partial w}{\partial y} + w\frac{\partial w}{\partial z}\right) = -\frac{\partial p}{\partial z} - \rho g + \Gamma_H\left(\frac{\partial^2 w}{\partial x^2} + \frac{\partial^2 w}{\partial y^2}\right) + \frac{\partial}{\partial z}\left(\Gamma_Z\frac{\partial w}{\partial z}\right)$$

  7. Basic governing equations of MM

Energy equation

$$\rho\left(\frac{\partial \theta}{\partial t} + u\frac{\partial \theta}{\partial x} + v\frac{\partial \theta}{\partial y} + w\frac{\partial \theta}{\partial z}\right) = \Gamma_H\left(\frac{\partial^2 \theta}{\partial x^2} + \frac{\partial^2 \theta}{\partial y^2}\right) + \frac{\partial}{\partial z}\left(\Gamma_Z\frac{\partial \theta}{\partial z}\right) + \Phi_\theta + R$$

Humidity equation

$$\rho\left(\frac{\partial q}{\partial t} + u\frac{\partial q}{\partial x} + v\frac{\partial q}{\partial y} + w\frac{\partial q}{\partial z}\right) = \frac{\partial}{\partial x}\left(\Gamma_H\frac{\partial q}{\partial x}\right) + \frac{\partial}{\partial y}\left(\Gamma_H\frac{\partial q}{\partial y}\right) + \frac{\partial}{\partial z}\left(\Gamma_Z\frac{\partial q}{\partial z}\right) + \Phi_q$$

Equation of state

$$p = \rho R T, \qquad R = R_0\left(\frac{1-q}{M_{air}} + \frac{q}{M_{H_2O}}\right), \qquad \theta = T\left(\frac{p_0}{p}\right)^{R/c_p}$$

  8. Regional models of air pollution transport Modern photochemical dispersion models are based on the Eulerian approach, which allows for an integrated “one-atmosphere” assessment of gaseous and particulate air pollution over many scales ranging from suburban to continental. These models simulate the emission, dispersion, chemical reaction and removal of pollutants in the troposphere by solving the pollutant continuity equation for each chemical species on a system of nested three-dimensional grids.

Pollutant continuity equations

$$\rho\left(\frac{\partial c}{\partial t} + u\frac{\partial c}{\partial x} + v\frac{\partial c}{\partial y} + w\frac{\partial c}{\partial z}\right) = \frac{\partial}{\partial x}\left(\Gamma_H\frac{\partial c}{\partial x}\right) + \frac{\partial}{\partial y}\left(\Gamma_H\frac{\partial c}{\partial y}\right) + \frac{\partial}{\partial z}\left(\Gamma_Z\frac{\partial c}{\partial z}\right) + \Phi_c$$

  9. In general, the considered conservation laws for mass, momentum and energy may be written as follows:

$$\frac{\partial C}{\partial t} + \mathrm{K}_x + \mathrm{K}_y + \mathrm{K}_z = \Delta_x + \Delta_y + \Delta_z + f$$

Here C is a generalized variable: C = 1, u, v, w, θ, q, c. K and Δ denote the advection and diffusion terms of the equation; a worked identification for one of the equations is shown below. The equations, together with initial and boundary conditions for the dependent variables and information on the properties of the continua, constitute the mathematical formulation of the problem. Computer realization of the models is based on the application of non-trivial numerical algorithms and requires high-performance computational resources.
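
As an illustration (not from the original slides), the energy equation of slide 7 can be cast in the generalized form as follows; the division by ρ is an assumed normalization:

$$C = \theta:\qquad \mathrm{K}_x = u\frac{\partial \theta}{\partial x},\qquad \Delta_x = \frac{\Gamma_H}{\rho}\frac{\partial^2 \theta}{\partial x^2},\qquad \Delta_z = \frac{1}{\rho}\frac{\partial}{\partial z}\left(\Gamma_Z\frac{\partial \theta}{\partial z}\right),\qquad f = \frac{\Phi_\theta + R}{\rho},$$

with K_y and Δ_y defined analogously in y.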

  10. Numerical methods for solution Grids: equidistant grid in longitude and latitude, non-equidistant in the vertical direction (σ-coordinate). High-order approximation of derivatives in space.

Implicit finite-difference scheme

$$\frac{C^{\,n+1}_{i,j,k} - C^{\,n}_{i,j,k}}{\tau} + \left(\mathrm{K}_x + \mathrm{K}_y + \mathrm{K}_z\right)^{n+1}_{i,j,k} = \left(\Lambda_x + \Lambda_y + \Lambda_z\right)^{n+1}_{i,j,k} + f^{\,n}_{i,j,k}$$

Absolutely stable, but the difference equations are non-linear.

Explicit-implicit difference scheme

$$\frac{C^{\,n+1}_{i,j,k} - C^{\,n}_{i,j,k}}{\tau} + \left(\mathrm{K}_x + \mathrm{K}_y + \mathrm{K}_z\right)^{n}_{i,j,k} = \left(\Lambda_x + \Lambda_y\right)^{n}_{i,j,k} + \left(\Lambda_z\right)^{n+1}_{i,j,k} + f^{\,n}_{i,j,k}$$

Conditionally stable: τ (sec) < 3·dx (km); sweeps in the vertical coordinate (see the sketch below).
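
The implicit treatment of vertical diffusion in the explicit-implicit scheme leads, for each vertical column (i, j), to a tridiagonal system that is solved by a sweep (Thomas algorithm). The following is a minimal sketch of one such column solve; the names (nz, gz, dz, tau, c_old, c_new) and the simplified boundary conditions are illustrative assumptions, not the model's actual code.

```c
#include <stdlib.h>

/* Minimal sketch: implicit vertical-diffusion step for one (i,j) column,
 * solved with the Thomas (tridiagonal) algorithm.
 * Discretized equation per interior level k:
 *   (c_new[k] - c_old[k]) / tau = d/dz( Gz * d c_new / dz )
 * Names and boundary treatment are illustrative assumptions. */
void vertical_sweep(int nz, double tau, double dz,
                    const double *gz,      /* exchange coeff. at lower interface of level k */
                    const double *c_old,   /* values at time level n   */
                    double *c_new)         /* values at time level n+1 */
{
    double *a = malloc(nz * sizeof *a);    /* sub-diagonal    */
    double *b = malloc(nz * sizeof *b);    /* main diagonal   */
    double *c = malloc(nz * sizeof *c);    /* super-diagonal  */
    double *d = malloc(nz * sizeof *d);    /* right-hand side */

    /* simple boundary conditions: keep boundary values fixed */
    a[0] = 0.0;      b[0] = 1.0;      c[0] = 0.0;      d[0] = c_old[0];
    a[nz-1] = 0.0;   b[nz-1] = 1.0;   c[nz-1] = 0.0;   d[nz-1] = c_old[nz-1];

    for (int k = 1; k < nz - 1; ++k) {
        double r = tau / (dz * dz);
        a[k] = -r * gz[k];             /* coupling to level k-1 */
        c[k] = -r * gz[k + 1];         /* coupling to level k+1 */
        b[k] = 1.0 - a[k] - c[k];      /* diagonally dominant => stable sweep */
        d[k] = c_old[k];               /* explicit advection/source terms would be added here */
    }

    /* forward elimination */
    for (int k = 1; k < nz; ++k) {
        double m = a[k] / b[k - 1];
        b[k] -= m * c[k - 1];
        d[k] -= m * d[k - 1];
    }
    /* back substitution */
    c_new[nz - 1] = d[nz - 1] / b[nz - 1];
    for (int k = nz - 2; k >= 0; --k)
        c_new[k] = (d[k] - c[k] * c_new[k + 1]) / b[k];

    free(a); free(b); free(c); free(d);
}
```

Because only the vertical direction is treated implicitly, each column can be swept independently, which is one reason this scheme combines well with a horizontal (2D) domain decomposition.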

  11. Technology of computations in nested domains
[Figure: nested computational domains over the Tomsk Region and the city of Tomsk, showing the Ob and Tom rivers and the cities of Tomsk, Anzhero-Sudzhensk, Yurga and Kemerovo.]

  12. Examples of MM: Mesoscale Model 5 • The MM5 mesoscale model has been developed at Penn State and NCAR as a community mesoscale model. • The MM5 includes a multiple-nest capability, nonhydrostatic dynamics (which allows the model to be used at a few-kilometer scale), multitasking capability on shared- and distributed-memory machines, and a wide range of physics options. • The MM5 runs on various supercomputers such as Compaq Cluster, Cray T3E, IBM SP, SGI Origin, SUN and PC Linux clusters. DM parallelization is based on 2D domain decomposition and the MPI standard (see the sketch below).
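
As an illustration of distributed-memory parallelization by 2D horizontal domain decomposition, the sketch below creates a 2D Cartesian process grid with MPI and lets each rank determine its neighbours for halo exchange. It is a minimal illustration of the general technique, not MM5 source code; the variable names and the balanced process-grid choice are assumptions.

```c
#include <mpi.h>
#include <stdio.h>

/* Minimal sketch of 2D horizontal domain decomposition with MPI.
 * Each process owns one rectangular tile of the horizontal grid and
 * exchanges halo rows/columns with its four neighbours. */
int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int nprocs;
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    /* let MPI pick a balanced 2D process grid, e.g. 10 = 5 x 2 */
    int dims[2] = {0, 0}, periods[2] = {0, 0};
    MPI_Dims_create(nprocs, 2, dims);

    MPI_Comm cart;
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 1, &cart);

    int rank, coords[2];
    MPI_Comm_rank(cart, &rank);
    MPI_Cart_coords(cart, rank, 2, coords);

    /* neighbours in x (dimension 0) and y (dimension 1) for halo exchange */
    int west, east, south, north;
    MPI_Cart_shift(cart, 0, 1, &west, &east);
    MPI_Cart_shift(cart, 1, 1, &south, &north);

    printf("rank %d -> tile (%d,%d) of %dx%d; neighbours W=%d E=%d S=%d N=%d\n",
           rank, coords[0], coords[1], dims[0], dims[1],
           west, east, south, north);

    MPI_Comm_free(&cart);
    MPI_Finalize();
    return 0;
}
```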

  13. MM5 is installed on Linux clusters at TSU and IAO. Financial support: RFBR grant N 04-07-90219.

  14. Examples of MM: WRF • The WRF is being developed as a collaborative effort among NCAR, NCEP and others. • The WRF model is designed to be a flexible, state-of-the-art, portable code that is efficient in a massively parallel computing environment, and it includes support for moving nested grids. • It offers numerous physics options and is suitable for use in a broad spectrum of applications across scales ranging from meters to thousands of kilometers. • It executes efficiently on a range of computing platforms (distributed- and shared-memory, vector and scalar types).

  15. Application of the MM5 & WRF
[Figure: three nested domains D1 (450×450 km), D2 (150×150 km), D3 (50×50 km) around Tomsk, with Kemerovo and Novosibirsk marked, and the distribution of land-use categories: water, sparse vegetation, mixed forest, evergreen forest, urban area.]

  16. Application of the MM5 & WRF

MM5:
• Grids 52×52×31 for domains D1, D2, D3
• Horizontal resolution: 9; 3; 1 km for D1, D2, D3
• Temporal step: 27; 9; 3 sec for D1, D2, D3
• Vertical size of domains: 17 km
• Cluster IAO & TSU

WRF:
• Grids 52×52×31 for domains D1, D2, D3
• Horizontal resolution: 9; 3; 1 km for D1, D2, D3
• Temporal step: 60; 30; 10 sec for D1, D2, D3
• Vertical size of domains: 17 km
• Cluster IAO SB RAS

  17. MM5 domain decomposition for DM parallelism. Number of processors: 10 = 5 × 2 (a sketch of the resulting tile sizes follows below).
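
As a hypothetical illustration of how such a decomposition divides the horizontal grid, the sketch below computes the tile extents when a 52×52 horizontal grid (the grid size quoted on slide 16) is split across a 5×2 process grid; the helper name and the block-partitioning rule (first remainder processes get one extra point) are assumptions for illustration, not the MM5 algorithm itself.

```c
#include <stdio.h>

/* Illustrative block partition of n grid points among p processes:
 * the first (n % p) processes get one extra point. */
static int block_size(int n, int p, int coord)
{
    return n / p + (coord < n % p ? 1 : 0);
}

int main(void)
{
    const int nx = 52, ny = 52;   /* horizontal grid (as on slide 16)   */
    const int px = 5,  py = 2;    /* process grid 10 = 5 x 2 (slide 17) */

    for (int j = 0; j < py; ++j)
        for (int i = 0; i < px; ++i)
            printf("tile (%d,%d): %d x %d points\n",
                   i, j, block_size(nx, px, i), block_size(ny, py, j));
    return 0;
}
```

With these numbers each process receives a tile of 10 or 11 points in x and 26 points in y, which keeps the load nearly balanced across the ten processors.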

  18. Distributed Memory parallel system at IAO • Number of nodes = 20 • Node: 2×Pentium III 1 GHz, RAM 1 GB • Network: Gigabit Ethernet • Rpeak = 20 Gflops • RLinpack = 10.5 Gflops • 2001

  19. Distributed Memory parallel system at TSU • Number of nodes = 283 • Node: 2×Intel Xeon Woodcrest 2.66 GHz, RAM 4 GB, HDD 80 GB • Network: InfiniPath • Rpeak = 12000 Gflops • RLinpack = 9013 Gflops • 2007
