A Benchmark for Inpainting-Based Image Reconstruction and Compression (PowerPoint presentation)



SLIDE 1

Dagstuhl Seminar on Inpainting-Based Image Compression
Dagstuhl, November 13 – 18, 2016


A Benchmark for Inpainting-Based Image Reconstruction and Compression

Sarah Andris, Leif Bergerhoff, and Joachim Weickert
Mathematical Image Analysis Group
Faculty of Mathematics and Computer Science
Saarland University, Saarbrücken, Germany

SLIDE 2

Motivation


  • Benchmarks can encourage research advances by allowing fair comparison.
  • Classical examples in computer vision:
    • Middlebury Optical Flow Benchmark
    • Berkeley Segmentation Benchmark
  • Inpainting test set with 100 images: Kawai et al. 2009
  • Our goal: introduce such a benchmark for inpainting-based reconstruction and compression. What features should we include?

SLIDE 3

Outline


  • Motivation
  • Structure
  • Data Optimisation
  • Optimal Reconstruction and Complete Codecs
  • Error Measures
  • Conclusions


SLIDE 4

Structure (1)


  • Provide test sets, code for basic methods, and rankings.
  • Layer 1: Data Optimisation
    • mask positions
    • grey / colour values
  • Layer 2: Optimal Reconstruction
    • mask positions
    • grey / colour values
    • inpainting operator
  • Layer 3: Complete Codecs
    • mask generation and grey / colour value optimisation
    • optimal inpainting operator
    • quantisation methods
    • storage and coding of acquired data


SLIDE 5

Structure (2)


Test Sets

  • Currently three test sets: SmallGrey, SmallColour, LargeColour, ...
  • Small sets: 5 well-known images each.
  • Large set: 24 images from the Kodak database.
  • The Kodak database has been released for unrestricted usage.
  • We might have to separate the images into a training and a test set.


SLIDE 6

Structure (3)


Possible Extensions

  • Generate own images tailored towards inpainting-specific challenges:
    • texture,
    • cartoon,
    • ...
  • Include high-resolution images:
    • 4K,
    • 1080p,
    • ...
  • Add data sets for special image types:
    • high dynamic range (HDR) images,
    • spectral images,
    • medical images,
    • depth maps,
    • ...


SLIDE 7

Data Optimisation (1)


  • Users can upload masks and corresponding grey or colour values for every test set.
  • Homogeneous diffusion inpainting is then performed automatically.
  • Permitted masks contain 3%, 5%, or 10% of the total number of image pixels.
  • We measure the quality of reconstructed images in terms of MSE and SSIM.

[Figure: original, mask, colour values, reconstruction]

  • Keep it simple: only binary masks and integer values.
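The homogeneous diffusion inpainting step can be sketched as a Jacobi iteration that keeps the mask pixels fixed as Dirichlet data and lets all other pixels relax to the average of their neighbours. This is a minimal illustration, not the benchmark's actual solver; the function name and iteration count are my own:

```python
import numpy as np

def diffuse_inpaint(image, mask, n_iter=2000):
    """Homogeneous diffusion inpainting by Jacobi iteration.

    Pixels where mask is True keep their stored values; all other
    pixels converge to the steady state of u_t = Laplace(u), i.e.
    the discrete harmonic interpolant of the known data.
    """
    u = np.where(mask, image, image[mask].mean()).astype(float)
    for _ in range(n_iter):
        # average of the four neighbours (reflecting boundaries)
        p = np.pad(u, 1, mode="edge")
        avg = 0.25 * (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:])
        # re-impose the known mask pixels after each sweep
        u = np.where(mask, image, avg)
    return u
```

On a horizontal ramp with the leftmost and rightmost columns as mask, the harmonic interpolant is the ramp itself, which makes a convenient sanity check.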


SLIDE 8

Data Optimisation (2)


Possible Extensions

  • Provide additional inpainting methods:
    • biharmonic, EED (Weickert 1996), exemplar-based (Facciolo et al. 2009), ...
  • Problem: reconstruction quality may depend on the implementation.
  • Including methods from users is a security risk and is therefore not planned.

Two Showcase Methods

  • Method 1: dithering on the Laplace magnitude (Belhachmi et al. 2009)
    • The Laplace magnitude indicates good mask positions.
    • Realise the dithering through electrostatic halftoning (Schmaltz et al. 2010).
  • Method 2: probabilistic sparsification (Mainberger et al. 2012)
    • Randomly remove points from the full mask.
    • Add back a certain percentage of those with a large inpainting error.
    • Repeat until the desired density is reached.
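The sparsification loop of Method 2 can be sketched as below. This is a simplified illustration of the idea, not the reference implementation of Mainberger et al.; the `inpaint` callback, the candidate and re-add fractions, and all names are assumptions:

```python
import numpy as np

def sparsify(image, inpaint, density=0.05, p_remove=0.02, p_keep=0.5, rng=None):
    """Probabilistic sparsification (sketch).

    inpaint(image, mask) is assumed to return a reconstruction from
    the masked pixels, e.g. a homogeneous diffusion solver.
    Starting from the full mask, repeatedly remove a small random
    candidate set, inpaint, and re-add the fraction of candidates
    with the largest local error, until the target density is reached.
    """
    rng = np.random.default_rng(rng)
    mask = np.ones(image.shape, dtype=bool)      # start from the full mask
    target = int(density * image.size)
    while mask.sum() > target:
        idx = np.flatnonzero(mask)
        n_cand = max(1, int(p_remove * idx.size))
        cand = rng.choice(idx, size=n_cand, replace=False)
        mask.flat[cand] = False                  # tentatively remove candidates
        err = (inpaint(image, mask) - image) ** 2
        # re-add the p_keep fraction of candidates with the largest error
        order = np.argsort(err.flat[cand])[::-1]
        mask.flat[cand[order[: int(p_keep * n_cand)]]] = True
    return mask
```

Since fewer points are re-added than removed in each sweep (p_keep < 1), the mask density decreases monotonically and the loop terminates.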


SLIDE 9

Data Optimisation (3)

Results

  • Dithering: MSE 60.7 at 5% density, MSE 148.4 at 3% density
  • Sparsification: MSE 55.7 at 5% density, MSE 146.3 at 3% density

SLIDE 10

Optimal Reconstruction and Complete Codecs (1)


  • Layer 2: Optimal Reconstruction
    • mask generation and grey / colour value optimisation
    • optimal inpainting operator

  Task: Given a mask density, produce the best reconstruction.

  • Layer 3: Complete Codecs
    • mask generation and grey / colour value optimisation
    • optimal inpainting operator
    • quantisation methods
    • storage and coding of acquired data

  Task: Given a compression rate, produce the best reconstruction.

  • Efficiency can only be judged by the outcome of the complete process.
  • We record the error of the reconstructed image w.r.t. the original.


SLIDE 11

Optimal Reconstruction and Complete Codecs (2)


Verification

  • Verification through hidden ground truth is not possible.
  • Should we check for fraud by verifying programs?

  Verification:
    + fraud becomes much harder (but not impossible)
    – security risk
    – high maintenance
    – restrictions on submitted code

  No verification:
    – fraud may remain undetected
    + no attacks through malicious programs
    + low maintenance
    + no restrictions necessary

  • Verifying programs is risky, burdensome, and error-prone.
  • Users can still supply a link to their code.
  • We require the corresponding paper as a reference.


SLIDE 12

Error Measures (1)


Error Measures

  • Choosing an appropriate error measure is essential for a benchmark.
  • Most benchmarks present several measures or use a combination.
  • Very well-known measure: MSE (or, equivalently, PSNR)
  • Problem: it often does not reflect human perception.
  • Remedy: error measures that take structure / statistics into account
  • The structural similarity measure (SSIM) has become popular.
  • Is the SSIM really superior?
    • Dosselmann and Yang 2011: strong correlation between localised MSE and SSIM
    • Galić et al. 2012: MSE and SSIM behave similarly in a compression context
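The equivalence of MSE and PSNR is just a monotone log transform, so both rank reconstructions identically. A minimal sketch (function names are my own):

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two images."""
    return float(np.mean((np.asarray(a, float) - np.asarray(b, float)) ** 2))

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB: a strictly decreasing
    function of the MSE, hence equivalent as a ranking criterion."""
    return 10.0 * np.log10(peak ** 2 / mse(a, b))
```

For 8-bit images a uniform deviation of 10 grey levels gives MSE 100 and a PSNR of roughly 28.1 dB.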

SLIDE 13

Error Measures (2)


MSE versus SSIM

  • Hypersphere: apply image distortions that result in similar MSE / SSIM values.
  • Example: MSE ≈ 200, SSIM ≈ 0.7 (image courtesy: P. Peter)

  [Figure: the same image distorted by blur, noise, JPEG compression, and a value shift]

  • Here, neither measure represents human perception well.
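For the noise case, an equal-MSE distortion as on this slide can be generated by rescaling zero-mean noise so the resulting MSE hits the target exactly. A hypothetical helper, not from the talk:

```python
import numpy as np

def distort_to_mse(image, target_mse, rng=None):
    """Add zero-mean Gaussian noise rescaled so that the mean squared
    deviation from the original equals target_mse exactly."""
    rng = np.random.default_rng(rng)
    noise = rng.normal(size=np.asarray(image, float).shape)
    # rescale: after this, mean(noise**2) == target_mse up to rounding
    noise *= np.sqrt(target_mse / np.mean(noise ** 2))
    return image + noise
```

Analogous scalings exist for blur strength or JPEG quality, but there the MSE must be matched by a search rather than a closed-form rescaling.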


SLIDE 14

Error Measures (3)


Texture and Colour

  • Judging the quality of texture is particularly challenging.

  [Figure: original vs. two reconstructions with MSE 13136.9 / SSIM 0.25 and MSE 7479.2 / SSIM 0.48]

  • Possible remedy: a dedicated texture error measure (e.g. Žujović et al. 2012)
  • Error measures for colour images suffer from similar problems.
  • The choice of error measure influences the design of methods.


SLIDE 15

Conclusions


  • We propose a benchmark with three layers of optimisation:
    • Layer 1: Data Optimisation
    • Layer 2: Optimal Reconstruction
    • Layer 3: Complete Codecs
  • Data optimisation can already be tested on the website.
  • The benchmark website is intended as a service to our research community.
  • The choice of appropriate error measures is open for discussion.

Thank you!

http://benchmark.mia.uni-saarland.de
