SLIDE 1

Reservoir Simulation of Million-Cell Models on Desktop Computers

Knut-Andreas Lie

SINTEF ICT, Dept. Applied Mathematics

The 22nd Kongsberg Seminar, May 2009

SLIDE 2

Physical Scales in Porous Media Flow

... one cannot resolve them all at once

The scales that impact fluid flow in oil reservoirs range from the micrometre scale of pores and pore channels, via the dm–m scale of well bores and sediment laminae, to sedimentary structures that stretch across the entire reservoir.

SLIDE 4

Geological Models

Articulation of the geologists’ perception of the reservoir

Geological models (here: geo-cellular models):
- describe the reservoir geometry (horizons, faults, etc.)
- are typically generated using geostatistics
- give rock parameters (permeability and porosity)

Rock parameters:
- have a multiscale structure
- details on all scales impact flow
- permeability spans many orders of magnitude

Example: the Brent sequence (Tarbert and Upper Ness formations)

SLIDE 10

Flow Simulation

Gap in resolution and model sizes

Gap in resolution:
- geomodels: 10⁷–10⁹ cells
- simulators: 10⁵–10⁶ cells
→ sector models and/or upscaling of parameters

Many upscaling alternatives (a small sketch of the simple averages follows below):
- harmonic, arithmetic, geometric, ...
- local methods (K or T)
- global methods
- local-global methods
- pseudo methods
- ensemble methods
- steady-state methods

[Diagram: fine-grid geomodel ⇓⇑ coarse grid blocks ⇓⇑ local flow problems]
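To make the simplest of these alternatives concrete, here is a small sketch of upscaling by plain averaging. The deck only names the averages; the NumPy setup and the numbers are illustrative assumptions.

```python
# Sketch: single-phase upscaling of permeability by simple averaging.
# Hypothetical setup; the slide only names these averages.
import numpy as np

def average_permeability(K, method="harmonic"):
    """Collapse the fine-scale permeabilities K of one coarse block
    into a single effective value."""
    K = np.asarray(K, dtype=float).ravel()
    if method == "arithmetic":    # exact for flow parallel to perfect layers
        return K.mean()
    if method == "harmonic":      # exact for flow perpendicular to layers
        return K.size / np.sum(1.0 / K)
    if method == "geometric":     # common heuristic for 2D lognormal media
        return np.exp(np.log(K).mean())
    raise ValueError(f"unknown method: {method}")

# Fine cells spanning four orders of magnitude, as in real geomodels:
block = [1e-1, 1e-2, 1e-3, 1e-5]
for m in ("arithmetic", "geometric", "harmonic"):
    print(m, average_permeability(block, m))
```

The spread between the three answers on the same block is exactly why the slide lists so many more sophisticated alternatives.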

SLIDE 12

Simulation on Seismic/Geologic Grid

Why do we want/need it?

Upscaling:
- bottleneck in the workflow
- loss of information/accuracy
- not sufficiently robust
- extensions to multiphase flow are somewhat shaky

Simulation on the seismic/geologic grid:
- best possible resolution of the physical processes
- faster model building and history matching
- makes inversion a better instrument to find remaining oil
- better estimation of uncertainty by running alternative models

SLIDE 13

Examples

North Sea: Gullfaks field

Bypassed oil (4D inversion vs. simulation)

Source: Arnesen, WPC, Madrid, 2008

SLIDE 14

Examples

Giant Middle-East field

Difference in resolution (10 million vs 1 billion cells)

Source: Dogru et al., SPE 119272
SLIDE 17

Million-Cell Models on Desktop Computers

How to get there..?

Simplified flow physics:
- "full physics" is typically only required towards the end of a workflow

Operator splitting:
- a fully coupled solution is slow
- the subequations often have different time scales
- splitting opens the door to tailor-made methods (a minimal sketch follows below)
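As an illustration of the splitting, a deliberately trivial 1D two-phase sketch: one "pressure" step with mobilities frozen, then several smaller explicit transport steps with the total velocity frozen. All physics and numbers here are assumptions for illustration (unit total velocity, quadratic relative permeabilities, equal viscosities), not the solvers behind the timings quoted later.

```python
# Sketch: sequential operator splitting (IMPES-style) for 1D two-phase flow.
import numpy as np

def frac_flow(s):
    """Water fractional flow f_w(s); quadratic relperms, equal viscosities."""
    return s**2 / (s**2 + (1 - s)**2)

def pressure_step(s):
    """Stand-in for the elliptic pressure solve: in 1D incompressible flow
    the total Darcy velocity is constant in space."""
    return 1.0

def transport_step(s, v, dt, dx, phi=0.2):
    """Explicit upwind finite-volume update of the saturation equation."""
    f = frac_flow(s)
    f_in = np.concatenate(([1.0], f[:-1]))   # pure water injected at x = 0
    return s + v * dt / (phi * dx) * (f_in - f)

n, dx, dt = 100, 1.0 / 100, 5e-4
s = np.zeros(n)
for _ in range(50):                   # outer loop: one pressure step ...
    v = pressure_step(s)
    for _ in range(4):                # ... then several smaller transport steps
        s = transport_step(s, v, dt, dx)
print("water front near cell", int(np.argmax(s < 1e-3)))
```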

SLIDE 18

Streamline Simulation

Operator splitting + Eulerian-Lagrangian formulation
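For context (this is the standard streamline formulation, not spelled out on the slide): introducing the time-of-flight τ along streamlines of the total velocity v turns the 3D saturation equation into a family of 1D transport problems,

$$\vec{v} \cdot \nabla \tau = \phi, \qquad \frac{\partial S}{\partial t} + \frac{\partial f(S)}{\partial \tau} = 0,$$

so the transport step of the operator splitting reduces to many independent 1D solves along streamlines.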

SLIDE 20

Example

Two SPE 10 models

[Figure: the two SPE 10 models (Tarbert and Upper Ness), with the injector and producers A–D]

Inhouse code:
- 60 × 220 × 85 = 1.1 million cells
- 2000 days of production from a five-spot, 25 time steps
- Intel 2.4 GHz with 2 GB RAM: multigrid 8 min 36 sec, multiscale 2 min 22 sec

FrontSim:
- 360 × 440 × 85 = 13.5 million cells
- Intel Xeon 5482, 3.2 GHz, 64 GB RAM
- single thread, 13.5 GB RAM used
- computing time: 1 h 55 min

SLIDE 21

Fast Solution of Fluid Transport

Optimal ordering: finite volumes (almost) as fast as streamline simulation?

[Figure: sparsity pattern (nz = 148) of the upwind transport matrix on a small 2D grid under the natural cell ordering vs. a topological ordering along the flow; in topological order the matrix is (block) lower triangular and can be solved in a single cell-by-cell sweep]
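A toy illustration of the idea, under assumptions: with single-point upwinding, each cell's new value depends only on its upstream neighbours, so an implicit transport step on an acyclic flux graph can be solved in one cell-by-cell sweep. The tiny flux graph and the linear tracer physics are hypothetical; the real algorithm also detects cycles and solves them as small blocks (see the cycle statistics two slides ahead). Requires Python 3.9+ for graphlib.

```python
# Sketch: topological reordering makes implicit upwind transport sequential.
from graphlib import TopologicalSorter

flux = {(0, 1): 1.0, (0, 2): 0.5, (1, 3): 1.0, (2, 3): 0.5}  # i -> j rates
pv = {c: 1.0 for c in range(4)}               # pore volumes
c_old = {0: 1.0, 1: 0.0, 2: 0.0, 3: 0.0}      # tracer concentrations
dt = 0.5

deps = {c: [] for c in pv}                    # upstream neighbours per cell
outflow = {c: 0.0 for c in pv}
for (i, j), q in flux.items():
    deps[j].append(i)
    outflow[i] += q

# Visit cells so every upstream neighbour is solved first; each implicit
# balance  pv/dt*(c - c_old) + outflow*c - inflow = 0  is then a scalar
# equation in the current cell's unknown alone.
c_new = {}
for cell in TopologicalSorter(deps).static_order():
    inflow = sum(flux[(i, cell)] * c_new[i] for i in deps[cell])
    c_new[cell] = (pv[cell] / dt * c_old[cell] + inflow) / (pv[cell] / dt + outflow[cell])
print(c_new)
```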

SLIDE 22

Example

Johansen formation, North Sea, potential CO2 injection site

Model: 50 × 50 × 1 km, rescaled by a factor of 0.1
Grid: 27 437 active cells

[Figure: saturation at 1000 days]

SLIDE 23

Example

Johansen formation: runtime/step and iterations/cell

Incompressible oil:

| Δt (days) | NR–UMFPACK time (s) | NR–UMFPACK iter. | NR–PFS time (s) | NR–PFS iter. | NPFS time (s) | NPFS iter. |
|---|---|---|---|---|---|---|
| 125 | 2.26e+00 | 12.69 | 3.28e-01 | 12.69 | 4.44e-02 | 0.93 |
| 250 | 2.35e+00 | 12.62 | 3.32e-01 | 12.62 | 4.73e-02 | 1.10 |
| 500 | 2.38e+00 | 13.25 | 3.46e-01 | 13.25 | 4.16e-02 | 1.41 |
| 1000 | 2.50e+00 | 13.50 | 3.49e-01 | 13.50 | 4.21e-02 | 1.99 |

Compressible oil:

| Δt (days) | NR–UMFPACK time (s) | NR–UMFPACK iter. | NR–PFS time (s) | NR–PFS iter. | NPFS time (s) | NPFS iter. |
|---|---|---|---|---|---|---|
| 125 | 2.19e+00 | 12.69 | 3.91e-01 | 12.69 | 5.82e-02 | 1.33 |
| 250 | 2.02e+00 | 12.75 | 3.86e-01 | 12.75 | 6.07e-02 | 1.48 |
| 500 | 2.09e+00 | 13.25 | 3.90e-01 | 13.25 | 6.16e-02 | 1.79 |
| 1000 | 2.20e+00 | 14.00 | 4.11e-01 | 14.00 | 6.39e-02 | 2.38 |

Time to compute the reordering: 3.6 · 10⁻³ sec. On average 77.4 cycles per step, involving 780 cells; the largest cycle has 380 cells.

SLIDE 24

Example

Johansen formation: localization of nonlinear iterations

SLIDE 26

Million-Cell Models on Desktop Computers

How to get there..?

Use of sparsity / (multiscale) structure:
- effects resolved on different scales
- small changes from one step to the next
- small changes from one simulation to the next

Example: SPE 10, Layer 36. Observations:
- pressure can be resolved on a coarse grid
- velocity must be resolved on the fine grid
→ multiscale method

SLIDE 30

Multiscale Pressure Solvers

From upscaling to multiscale methods

[Diagram: standard upscaling goes fine-grid model ⇓⇑ coarse grid blocks ⇓⇑ local flow problems; the multiscale method follows the same loop but retains the fine-scale solutions of the local flow problems when building the global solution]

SLIDE 32

Multiscale Pressure Solvers

Computation of multiscale basis functions

[Figure: neighbouring coarse blocks Ω_i and Ω_j, and the support Ω_ij of the associated basis function]

Each cell Ω_i carries a pressure basis function φ_i, and each face Γ_ij a velocity basis function ψ_ij, defined by the local flow problem

$$\psi_{ij} = -\lambda K \nabla \phi_{ij}, \qquad
\nabla \cdot \psi_{ij} =
\begin{cases}
w_i(x), & x \in \Omega_i, \\
-w_j(x), & x \in \Omega_j, \\
0, & \text{otherwise.}
\end{cases}$$

[Figure: velocity basis function ψ_ij for homogeneous K vs. heterogeneous K]
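The basis functions then serve as the unknowns of a coarse-scale system. In the multiscale mixed finite-element method (MsMFEM), in its standard formulation (added here for context, not taken from the slide), the fine-scale pressure and velocity are expanded as

$$p = \sum_i p_i\,\phi_i, \qquad \vec{v} = \sum_{ij} v_{ij}\,\psi_{ij},$$

where the block pressures p_i and coarse fluxes v_{ij} solve a global coarse problem, while the basis functions carry the fine-scale variation.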

SLIDE 34

Multiscale Pressure Solvers

Challenges posed by grids from real-life models

Challenges:
- unstructured grids
- (very) high aspect ratios (e.g., cells of 800 × 800 × 0.25 m)
- skewed and degenerate cells
- non-matching cells

Meeting the challenges:
- automated coarsening algorithms
- multipoint/mimetic fine-grid discretizations

SLIDE 35

Multiscale Pressure Solvers

Workflow with automated upgridding in 3D

1) Automated coarsening: uniform partition in index space for corner-point grids (example: 44 927 cells ↓ 148 blocks, 9 different coarse blocks; see the partitioning sketch below)
2) Detect all pairs of adjacent blocks
3) Compute the basis functions ψ_ij for all pairs of adjacent blocks
4) Each block in the coarse grid becomes a component for building the global solution
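A sketch of step 1 under assumptions: the active cells of an nx × ny × nz corner-point grid are given as a boolean mask, and the partition is simply uniform in the IJK index space. A real implementation would also split blocks that inactive cells leave disconnected; that is omitted here.

```python
# Sketch: uniform coarse partition in index space (step 1 of the workflow).
import numpy as np

def partition_ijk(active, coarse=(8, 8, 8)):
    """Return a coarse-block label per cell (-1 for inactive cells)."""
    i, j, k = np.indices(active.shape)
    bi, bj, bk = i // coarse[0], j // coarse[1], k // coarse[2]
    nbx = -(-active.shape[0] // coarse[0])    # ceil-divide: blocks per axis
    nby = -(-active.shape[1] // coarse[1])
    block = bi + nbx * (bj + nby * bk)        # linearized block index
    return np.where(active, block, -1)

# Tiny example: a 16 x 16 x 8 grid with ~5% of the cells inactive.
active = np.random.default_rng(0).random((16, 16, 8)) > 0.05
labels = partition_ijk(active, coarse=(8, 8, 4))
print("coarse blocks:", len(np.unique(labels[labels >= 0])))
```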

SLIDE 37

Multiscale Pressure Solvers

Key to efficiency: reuse of computations

Computational cost consists of:
- basis functions (fine grid)
- global problem (coarse grid)

A full simulation takes O(10²) time steps. High efficiency for multiphase flows:
- elliptic decomposition
- reuse of basis functions (see the sketch below)
- easy to parallelize

Example: 128³ grid

[Figure: number of operations (×10⁷) versus upscaling factor, 8×8×8 to 64×64×64, split into basis-function and global-system work, compared with a fine-scale AMG solve of cost O(n^1.2)]
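One way to picture the reuse of basis functions (a sketch under assumptions; the talk does not give the actual criterion): keep each local basis cached and recompute it only where the mobility field has drifted too far since the basis was built.

```python
# Sketch: lazy recomputation of multiscale basis functions between steps.
# All interfaces and the 10% threshold are illustrative assumptions.
import numpy as np

def update_basis(basis, built_mob, mobility, blocks, solve_local, tol=0.1):
    """basis[pair]: cached basis; built_mob[pair]: mobility when it was built;
    blocks[pair]: fine-cell indices of the local problem's support."""
    for pair, cells in blocks.items():
        drift = np.max(np.abs(np.log(mobility[cells] / built_mob[pair][cells])))
        if drift > tol:                      # mobility changed too much here:
            basis[pair] = solve_local(pair)  # redo this one local fine-grid solve
            built_mob[pair] = mobility.copy()
    return basis
```

Since each local solve is independent, the recomputations parallelize trivially, which is the "easy to parallelize" point above.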

SLIDE 38

Example

10th SPE Comparative Solution Project

Fine grid: 60 × 220 × 85
Coarse grid: 5 × 11 × 17
2000 days of production

[Figure: the model with the injector and producers A–D (Tarbert and Upper Ness)]

SLIDE 40

Example

10th SPE Comparative Solution Project

[Figure: production curves over 2000 days vs. the fine-grid reference; left: non-pseudo upscaling (Chevron, Geoquest, Roxar, StreamSim, TotalFinaElf); right: pseudo-based upscaling (Coates, Landmark, Phillips)]

SLIDE 41

Example

10th SPE Comparative Solution Project

Water-cut curves at the four producers

[Figure: water-cut curves at producers A–D over 2000 days; reference (fine grid) vs. MsMFEM (multiscale) vs. Nested Gridding (upscaling/downscaling)]

SLIDE 42

Example

History matching a million-cell model

Assimilation of production data to calibrate the model:
- 1 million cells, 32 injectors, and 69 producers
- 2475 days ≈ 7 years of water-cut data

Generalized travel-time inversion (quasi-linearization of the misfit functional) with analytical sensitivities computed along streamlines

CPU time (wall clock):

| Solver | Total | Pressure | Transport |
|---|---|---|---|
| Multigrid | 39 min | 30 min | 5 min |
| Multiscale | 17 min | 7 min | 6 min |

Computer: 2.4 GHz Core 2 Duo with 2 GB RAM
History match: 7 forward simulations, 6 inversions
Notice: the obvious potential for parallelizing the basis functions, streamline tracing, and 1D transport solves was not utilized

SLIDE 43

Adaptive Model Reduction of Transport Grids

Flow-based nonuniform coarsening

1. Segment the domain according to ln |v| (a sketch of this step follows after the list)
2. Combine small blocks
3. Split blocks with too large flow
4. Combine small blocks

Example: SPE 10, Layer 37
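A sketch of step 1 under assumptions (the bin count and the velocity field are made up; steps 2–4, the merging and splitting, are not shown): bin the cells into a few classes of log-velocity so that high-flow and low-flow regions end up in different blocks.

```python
# Sketch: segment cells by ln|v| (step 1 of the coarsening algorithm).
import numpy as np

def segment_by_log_velocity(v, n_bins=10):
    """v: cell-wise velocity magnitudes; returns an integer class per cell."""
    g = np.log(np.maximum(np.abs(v), 1e-300))      # guard against log(0)
    edges = np.linspace(g.min(), g.max(), n_bins + 1)
    return np.clip(np.digitize(g, edges) - 1, 0, n_bins - 1)

# e.g. one 60 x 220 layer's worth of cells, heavy-tailed velocity field
v = np.random.default_rng(1).lognormal(sigma=3.0, size=60 * 220)
print(np.bincount(segment_by_log_velocity(v)))
```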

SLIDE 44

Example

Production data for a real-field model

[Figure: water cut vs. pore volume injected for a real-field model; left: reference solution vs. coarsened grids of 1648, 891, 469, 236, and 121 blocks; right: reference vs. 1581, 854, 450, 239, and 119 blocks]

SLIDE 45

Summary

Keys to enable fast simulation on seismic/geological grids:
- simplified physics
- operator splitting
- sparsity / (multiscale) structure

In the future: fit-for-purpose rather than one-simulator-solves-all..?

SLIDE 46

Current and Future Research

[Figure: map of flow physics (simple → complex) against geological representation (coarse → detailed), positioning commercial simulators, the "GeoScale" technology, a fast reservoir simulator, and super-fast lightweight simulation]

- Split fine/coarse scales
- Very fast
- Near-well modeling

Large-scale simulation:
- Parallelization
- Multimillion reservoir cells
- Support for time-critical processes
- Optimal model reduction for the tradeoff between time and accuracy