SLIDE 1

Multiscale Methods for Capturing Geological Heterogeneity

Stein Krogstad and Knut–Andreas Lie

SINTEF ICT, Dept. Applied Mathematics

Rijswijk, May 3 2010

Applied Mathematics 03/05/2010 1/16

SLIDE 2

Physical Scales in Subsurface Modelling

The scales that impact fluid flow in oil reservoirs range from the micrometre scale of pores and pore channels, via the dm–m scale of well bores and sediment laminae, to sedimentary structures that stretch across entire reservoirs.


SLIDES 3–4

Geological Models

Expressing the geologists’ perception of the reservoir; here: geo-cellular models
  • describe the reservoir geometry (horizons, faults, etc.)
  • are typically generated using geostatistics (or process simulation)
  • give rock parameters (permeability and porosity)

Rock parameters:
  • have a multiscale structure
  • details on all scales impact flow
  • permeability spans many orders of magnitude

Ex: Brent sequence (Tarbert, Upper Ness)

SLIDES 5–10

Heterogeneity versus Flow Modelling

Gap in resolution:
  • Geomodels: 10⁷–10⁹ cells
  • Simulators: 10⁵–10⁶ cells
  → sector models and/or upscaling of parameters

[Diagram: fine grid ⇓⇑ coarse grid blocks ⇓⇑ local flow problems]

Many alternatives:
  • Harmonic, arithmetic, geometric, ...
  • Local methods (K or T)
  • Global methods
  • Local-global methods
  • Pseudo methods
  • Ensemble methods
  • Steady-state methods
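The simplest averaging rules in the list above fit in a few lines. A minimal sketch (not code from the talk; `upscale_permeability` is an illustrative name) of collapsing a stack of fine cells into one coarse-block value:

```python
import numpy as np

def upscale_permeability(k, how="harmonic"):
    """Collapse fine-cell permeabilities into one coarse-block value
    using a classical averaging rule (illustrative helper)."""
    k = np.asarray(k, dtype=float)
    if how == "harmonic":     # exact for layers in series (flow across layers)
        return float(k.size / np.sum(1.0 / k))
    if how == "arithmetic":   # exact for layers in parallel (flow along layers)
        return float(np.mean(k))
    if how == "geometric":    # a common compromise for log-normal fields
        return float(np.exp(np.mean(np.log(k))))
    raise ValueError(f"unknown rule: {how}")

layers = [100.0, 1.0, 100.0]                       # permeabilities in mD
print(upscale_permeability(layers, "harmonic"))    # ≈ 2.94: the tight layer dominates
print(upscale_permeability(layers, "arithmetic"))  # 67.0
```

The large spread between the two answers for the same layers is exactly why the choice of upscaling rule matters when permeability spans many orders of magnitude.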

SLIDES 11–12

Simulation on Seismic/Geologic Grid

Why do we want/need it? Upscaling:
  • is a bottleneck in the workflow,
  • gives loss of information/accuracy,
  • is not sufficiently robust,
  • has somewhat shaky extensions to multiphase flow.

Simulation on the seismic/geologic grid gives:
  • the best possible resolution of the physical processes,
  • faster model building and history matching,
  • inversion as a better instrument to find remaining oil,
  • better estimation of uncertainty by running alternative models.

SLIDE 13

Example: Gullfaks Field (North Sea)

Bypassed oil (4D inversion vs simulation):

Arnesen, WPC, Madrid, 2008

SLIDES 14–15

Example: Giant Middle-East Field

Difference in resolution (10 million vs 1 billion cells):

From Dogru et al., SPE 119272

SLIDES 16–18

How to Close the Resolution Gap...?

Simplified flow physics:
  • can often tell a lot about the fluid movement
  • “full physics” is typically only required towards the end of a workflow

Operator splitting:
  • fully coupled solution is slow
  • subequations often have different time scales
  • splitting opens up for tailor-made methods

Use of sparsity / (multiscale) structure:
  • effects resolved on different scales
  • small changes from one step to the next
  • small changes from one simulation to the next
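The operator-splitting idea above can be sketched as a sequential loop: solve the (elliptic) pressure equation once per large step, then substep the (hyperbolic) transport equation on its faster time scale. A toy 1D incompressible sketch under strong simplifications (unit grid, linear fractional flow, helper names are illustrative, not from the talk):

```python
import numpy as np

def solve_pressure(k, q):
    # TPFA for -d/dx(k dp/dx) = q on a unit 1D grid; returns face fluxes
    n = len(k)
    t = 2.0 / (1.0 / k[:-1] + 1.0 / k[1:])    # harmonic face transmissibilities
    A = np.zeros((n, n))
    for i, ti in enumerate(t):
        A[i, i] += ti;     A[i + 1, i + 1] += ti
        A[i, i + 1] -= ti; A[i + 1, i] -= ti
    A[0, 0] += 1.0                            # pin the pressure level
    p = np.linalg.solve(A, q)
    return t * (p[:-1] - p[1:])               # positive flux = left to right

def advance_saturation(s, flux, q, dt, nsub=10):
    # explicit upwind transport, substepped because transport lives on a
    # faster time scale than the pressure update
    for _ in range(nsub):
        upw = np.where(flux > 0, s[:-1], s[1:])   # upstream saturation
        div = np.zeros_like(s)
        div[:-1] += flux * upw                    # outflow across each face
        div[1:] -= flux * upw                     # inflow to the neighbour
        s = s + (dt / nsub) * (np.maximum(q, 0.0) - div)
    return np.clip(s, 0.0, 1.0)

n = 20
k = np.ones(n)                              # homogeneous toy permeability
q = np.zeros(n); q[0], q[-1] = 1.0, -1.0    # injector / producer pair
s = np.zeros(n)
for _ in range(5):                          # the splitting loop itself
    flux = solve_pressure(k, q)
    s = advance_saturation(s, flux, q, dt=0.5)
```

Each subequation can now get a tailor-made solver: a multiscale or multigrid method for the pressure step, a streamline or reordered solver for the transport step.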

SLIDES 19–20

From Upscaling to Multiscale Pressure Solvers

[Diagram: standard upscaling maps fine grid ⇓⇑ coarse grid blocks ⇓⇑ local flow problems; a multiscale method couples coarse grid blocks ⇓⇑ fine-scale flow problems directly]

SLIDES 21–24

Workflow with Automated Upgridding in 3D

1) Automated coarsening: uniform partition in index space for corner-point grids (here: 44 927 cells ↓ 148 blocks, 9 different coarse blocks)

2) Detect all adjacent blocks

3) Compute basis functions for all pairs (i, j) of adjacent blocks:

   ∇·ψij = wi(x) in block i,   ∇·ψij = −wj(x) in block j

4) Block in coarse grid: component for building the global solution
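Steps 1 and 2 of the workflow are easy to sketch for a plain Cartesian index grid (the talk's corner-point geometry and inactive-cell handling are omitted; function names are illustrative):

```python
import numpy as np

def partition_ui(dims, block):
    """Step 1, uniform partition in index space: assign every fine cell
    a coarse-block number from its (i, j, k) index."""
    ijk = np.indices(dims)                        # fine-cell index triplets
    coarse = [idx // b for idx, b in zip(ijk, block)]
    ncoarse = [int(np.ceil(d / b)) for d, b in zip(dims, block)]
    return np.ravel_multi_index(coarse, ncoarse)  # flat block number per cell

def adjacent_blocks(p):
    """Step 2: detect all pairs of blocks sharing a cell face."""
    pairs = set()
    for axis in range(p.ndim):
        a = np.moveaxis(p, axis, 0)
        left, right = a[:-1].ravel(), a[1:].ravel()
        pairs.update((min(i, j), max(i, j))
                     for i, j in zip(left, right) if i != j)
    return sorted(pairs)

p = partition_ui((4, 4, 2), (2, 2, 2))   # 32 fine cells -> 4 coarse blocks
print(adjacent_blocks(p))                # the 4 face-neighbour block pairs
```

Step 3 then solves one local flow problem per pair in `adjacent_blocks`, driven by the weight functions wi and −wj, and step 4 assembles these basis functions into the global coarse system.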

SLIDES 25–32

Multiscale Methods: Potential

  • More flexible wrt grids than standard upscaling methods: automatic coarsening
  • Reuse of computations, key to computational efficiency
  • Natural (elliptic) parallelism: giga-cell simulations, multicore and heterogeneous computing
  • Fine-scale velocity → different grids for flow and transport → dynamical adaptivity
  • Method for model reduction: adjoint simulations → approximate gradients; ensemble simulations with representative basis functions
  • Multiphysics applications (Stokes–Brinkmann)

[Figure: operations vs upscaling factor (8×8×8 to 64×64×64) for basis functions, global system, and fine-scale solution (AMG); roughly O(n^1.2) scaling]

SPE10 (1.1 million cells), in-house code from 2005: multiscale 2 min 20 s; multigrid 8 min 36 s.

[Figures: pressure grid vs transport grid; flow-based gridding with and without dynamic Cartesian refinement. Research by Vera Louise Hauge, Shell scholarship]

Water-flood optimization, reservoir geometry from a Norwegian Sea field:

[Figure: cumulative production (m³) vs time (years); oil and water, initial vs optimized]

Forward simulations: 44 927 cells, 20 time steps, < 5 sec in Matlab.

History matching, 1 million cells, 7 years: 32 injectors, 69 producers. Generalized travel-time inversion + multiscale: 7 forward simulations, 6 inversions.

CPU time (wall clock):

  Solver       Total    Pressure  Transport
  Multigrid    39 min   30 min    5 min
  Multiscale   17 min    7 min    6 min

SLIDE 33

Multiscale Methods: Limitations

How well do these methods handle complex physics?
  • No fully-implicit formulation available
  • Compressibility, gravity, ... → correction functions
  • Strong coupling → more iterations and updates of basis and correction functions
  • Forcing the residual to zero makes multiscale methods start to look like multigrid/domain decomposition
  • Not yet applied to compositional/thermal/... problems

SLIDE 34

Multiscale Methods: Limitations

Other issues:
  • How to choose good coarse grids for unstructured grids?
  • Need for global information or iterative procedures?
  • A posteriori error analysis (resolution or fine-scale junk)?
  • More than two levels in a hierarchical grid?
  • How to include pore-scale models?

SLIDE 35

Example: Fracture Corridors

[Figure: 800 × 800 reference vs 80 × 80 upscaled vs 80 × 80 multiscale]

SLIDE 36

Example: SPE10 with Fracture Corridors

[Figures: x-y permeability; saturation, reference; saturation, multiscale]

SLIDE 37

Example: SPE10 with Fracture Corridors

[Figures: field oil-production rate; field water cut; best well: water cut in P1; worst well: water cut in P8]

SLIDES 38–39

Efficient Transport Solvers

Streamline methods:
  • intuitive visualization + new data
  • subscale resolution
  • good scaling, known to be efficient

[Figures: flow pattern (CO2 injection); connections across faults; time-of-flight (timelines); flooded volumes (stationary tracer)]

SLIDE 40

Efficient Transport Solvers

Synthetic streamline example in FrontSim: 13.5 million cells; Intel Xeon, 3.2 GHz, 64 GiB; single thread, 13.5 GiB RAM; runtime: 1 h 55 min.

SLIDE 41

Efficient Transport Solvers

Optimal ordering:
  • same assumptions as for streamlines
  • utilize causality → O(n) algorithm, cell-by-cell solution
  • local control over (non)linear iterations

[Figure: 59 cells numbered in natural order vs in topological (causality-following) order]
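The reordering idea can be sketched with Kahn's algorithm: direct an edge from each cell to its downstream neighbours (read off from the signs of the phase fluxes) and sort so that every cell is visited after all its upstream neighbours. A minimal sketch, not the talk's implementation:

```python
from collections import deque

def topological_order(n, flux_edges):
    """Order n cells so each is solved after its upstream neighbours.
    flux_edges are (upstream, downstream) cell pairs taken from the
    signs of the interface fluxes."""
    down = [[] for _ in range(n)]
    indeg = [0] * n
    for u, v in flux_edges:
        down[u].append(v)
        indeg[v] += 1
    ready = deque(i for i in range(n) if indeg[i] == 0)
    order = []
    while ready:
        u = ready.popleft()
        order.append(u)
        for v in down[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    if len(order) != n:
        raise ValueError("flux graph has cycles")
    return order

# 2x2 grid, flow from cell 0 towards cell 3 through both neighbours
edges = [(0, 1), (0, 2), (1, 3), (2, 3)]
print(topological_order(4, edges))   # [0, 1, 2, 3]
```

With this ordering the implicit transport update becomes a sequence of single-cell problems, each solvable in O(1) work, giving the O(n) algorithm on the slide. Gravity or capillarity can create cycles in the flux graph; a fuller version would group strongly connected components and solve each group together.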

SLIDE 42

Efficient Transport Solvers

Local iterations. Johansen formation: 27 437 active cells; global vs local Newton–Raphson solver:

  Δt (days)   global: time (s) / iter   local: time (s) / iter
  125         2.26 / 12.69              0.044 / 0.93
  250         2.35 / 12.62              0.047 / 1.10
  500         2.38 / 13.25              0.042 / 1.41
  1000        2.50 / 13.50              0.042 / 1.99
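The "local" column comes from solving one scalar nonlinear equation per cell instead of one big coupled system. A minimal sketch of such a single-cell Newton solve, with an assumed quadratic fractional-flow function and illustrative numbers (not the talk's solver):

```python
def newton_cell(r, dr, s0, tol=1e-10, maxit=25):
    """Scalar Newton iteration for one cell's saturation residual;
    in the reordered solver, one such solve is done per cell."""
    s = s0
    for _ in range(maxit):
        f = r(s)
        if abs(f) < tol:
            break
        s = min(max(s - f / dr(s), 0.0), 1.0)   # clamp into [0, 1]
    return s

# implicit upwind balance for a single cell: s + dt*fw(s) = s_old + dt*inflow
dt, s_old, inflow = 0.5, 0.2, 0.4
fw  = lambda s: s * s / (s * s + (1 - s) ** 2)            # quadratic fractional flow
dfw = lambda s: 2 * s * (1 - s) / (s * s + (1 - s) ** 2) ** 2
r   = lambda s: s + dt * fw(s) - (s_old + dt * inflow)
dr  = lambda s: 1 + dt * dfw(s)
s_new = newton_cell(r, dr, s_old)
```

Because each cell's residual is smooth and monotone in s, the local iterations converge in a handful of steps, which is why the table's average iteration counts per cell stay near one even for large Δt.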

SLIDES 43–44

Efficient Transport Solvers

Flow-based coarsening:
  • agglomeration of cells → simple and flexible coarsening
  • hybrid gridding schemes
  • heterogeneous multiscale method?
  • efficient model reduction

[Figures: coarsening on a Cartesian grid and on triangular grids; different partitionings: uniform coarsening + NUC refinement, uniform coarsening + Cartesian/NUC refinement]

SLIDE 45

Efficient Transport Solvers

Model reduction by coarsening:

[Figure: water-cut curves vs pore volume injected for the reference solution and coarsened models with 1581, 854, 450, 239, and 119 blocks]

SLIDE 46

Summary

Keys to enable fast simulation on seismic/geological grids:
  • Simplified physics
  • Operator splitting
  • Sparsity / (multiscale) structure

In the future: fit-for-purpose rather than one-simulator-solves-all?

SLIDE 47

Current and Future Research

[Diagram: flow physics (simple → complex) vs geological representation (coarse → detailed); quadrants: “GeoScale” technology, commercial simulators, fast reservoir simulator, super-fast lightweight simulation]

Super-fast lightweight simulation:
  • Split fine / coarse scales
  • Very fast
  • Near-well modeling

Large-scale simulation:
  • Parallelization
  • Multimillion reservoir cells
  • Support for time-critical processes
  • Optimal model reduction for tradeoff between time and accuracy