Today: Non-linear filtering example; median filter (PowerPoint PPT presentation)



SLIDE 1

SLIDE 2

Today

SLIDE 3

Non-linear filtering example

SLIDE 4

Median filter

Replace each pixel by the median over N pixels (5 pixels in these examples). Generalizes to “rank order” filters. Spike noise is removed (In:/Out: figure).

5-pixel neighborhood

Monotonic edges remain unchanged (In:/Out: figure).
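Both behaviors can be checked with a few lines of NumPy (a minimal sketch; `median_filter_1d` is an illustrative helper, not code from the lecture):

```python
import numpy as np

def median_filter_1d(signal, radius=2):
    """Median filter over a (2*radius + 1)-pixel window (5 pixels here);
    the window is clamped at the signal boundaries."""
    out = np.empty_like(signal)
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        out[i] = np.median(signal[lo:hi])
    return out

# A spike is removed: one outlier can never be the median of 5 values.
spike = np.array([3, 3, 3, 3, 9, 3, 3, 3, 3], dtype=float)
print(median_filter_1d(spike))          # all 3s

# A monotonic edge is unchanged: the median of a sorted window is its center value.
edge = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)
print(median_filter_1d(edge))           # identical to the input
```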

SLIDE 5

Degraded image

SLIDE 6

Radius 1 median filter

Because the filter is non-linear, it has the opportunity to remove the scratch noise without blurring edges.

SLIDE 7

Radius 2 median filter

SLIDE 8

Comparison with the amount of linear blur needed to remove the scratches

SLIDE 9

CCD color sampling

SLIDE 10

Color sensing, 3 approaches

  • Scan 3 times (temporal multiplexing)
  • Use 3 detectors (3-CCD camera, and color film)
  • Use offset color samples (spatial multiplexing)

SLIDE 11

Typical errors in temporal multiplexing approach

Color offset fringes

SLIDE 12

Typical errors in spatial multiplexing approach.

Color fringes.

SLIDE 13

CCD color filter pattern


SLIDE 14

The cause of color moiré

Fine black-and-white detail in the image is misinterpreted as color information (detector figure).

SLIDE 15

Black and white edge falling on color CCD detector

Figure: a black-and-white image (edge) and the resulting detector pixel colors.

SLIDE 16

Color sampling artifacts

Interpolated pixel colors for a grey edge falling on colored detectors (linear interpolation). The edge is aliased (undersampled) in the samples of any one color. That aliasing shows up in the spatial domain as an incorrect estimate of the precise position of the edge, and the disagreement between color bands about where the edge is produces a color fringe artifact.

Figure: the response of independently interpolated color bands to a sharp luminance edge; the mis-estimated edge yields color fringe artifacts.
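The whole chain (undersampling per band, independent interpolation, disagreement at the edge) can be reproduced in 1-D (a sketch under an assumed layout: red sensed on even pixels, green on odd):

```python
import numpy as np

# A sharp luminance edge on a grey image: true R = G at every pixel.
x = np.arange(12)
edge = (x >= 6).astype(float)

# Spatial multiplexing: red is sensed on even pixels, green on odd pixels.
red = np.interp(x, x[0::2], edge[0::2])     # red band, linearly interpolated
green = np.interp(x, x[1::2], edge[1::2])   # green band, linearly interpolated

# Each undersampled band mis-estimates the edge position in its own way,
# so R - G is nonzero right at the edge: a color fringe.
fringe = red - green
print(fringe)   # 0 everywhere except 0.5 at pixels 5 and 6
```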

SLIDE 17

Typical color moire patterns

Blow-up of an electronic camera image. Notice the spurious colors in the regions of fine detail in the plants.

SLIDE 18

Color sampling artifacts

SLIDE 19

Human Photoreceptors

(From Foundations of Vision, by Brian Wandell, Sinauer Assoc.)

SLIDE 20

Brewster’s colors example (subtle).

Scale relative to human photoreceptor size: each line covers about 7 photoreceptors.

SLIDE 21

Median Filter Interpolation

1) Perform a first interpolation on the isolated color channels.
2) Compute color difference signals.
3) Median filter the color difference signal.
4) Reconstruct the 3-color image.
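The four steps can be sketched in 1-D with two colors (an illustrative reconstruction; the even/odd sampling layout and window size are assumptions):

```python
import numpy as np

def median_filter_1d(sig, radius=2):
    """5-sample median filter, window clamped at the boundaries."""
    return np.array([np.median(sig[max(0, i - radius):i + radius + 1])
                     for i in range(len(sig))])

x = np.arange(16)
edge = (x >= 8).astype(float)                 # grey edge: true R = G everywhere

# Step 1: interpolate each isolated color channel (R on even, G on odd pixels).
red = np.interp(x, x[0::2], edge[0::2])
green = np.interp(x, x[1::2], edge[1::2])

# Step 2: the color difference signal spikes where the bands disagree.
diff = red - green

# Step 3: median filter the color difference signal; the fringe spikes vanish.
diff_filtered = median_filter_1d(diff)

# Step 4: reconstruct the second color from the filtered difference.
green_fixed = red - diff_filtered
print(diff.max(), diff_filtered.max())   # 0.5 before filtering, 0.0 after
```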

SLIDE 22

Two-color sampling of BW edge

Figure panels: sampled data; linear interpolation; color difference signal; median filtered color difference signal.

SLIDE 23

R-G, after linear interpolation

SLIDE 24

R – G, median filtered (5x5)

SLIDE 25

Recombining the median filtered colors

Figure: linear interpolation vs. median filter interpolation.

SLIDE 26

References on color interpolation

  • Brainard
  • Shree Nayar
SLIDE 27

Image texture

SLIDE 28

Texture

  • Key issue: representing texture
    – Texture-based matching
      • little is known
    – Texture segmentation
      • key issue: representing texture
    – Texture synthesis
      • useful; also gives some insight into quality of representation
    – Shape from texture
      • cover superficially
SLIDE 29

The Goal of Texture Synthesis

Figure: true (infinite) texture → SYNTHESIS → generated image, from an input image.

  • Given a finite sample of some texture, the goal is to synthesize other samples from that same texture
    – The sample needs to be “large enough”

SLIDE 30

The Goal of Texture Analysis

Figure: true (infinite) texture → ANALYSIS → “same” or “different”, comparing a generated image and an input image.

Compare textures and decide if they’re made of the same “stuff”.

SLIDE 31

Pre-attentive texture discrimination

SLIDE 32

Pre-attentive texture discrimination

SLIDE 33

Pre-attentive texture discrimination

Same or different textures?

SLIDE 34

Pre-attentive texture discrimination

SLIDE 35

Pre-attentive texture discrimination

SLIDE 36

Pre-attentive texture discrimination

Same or different textures?

SLIDE 37

Julesz

  • Textons: analyze the texture in terms of statistical relationships between fundamental texture elements, called “textons”
  • It generally required a human to look at the texture in order to decide what those fundamental units were...

SLIDE 38

Influential paper:

SLIDE 39

Learn: use filters.

Bergen and Adelson, Nature 1988

SLIDE 40

Malik and Perona

Learn: use lots of filters, multi-orientation and multi-scale.

Malik J, Perona P. Preattentive texture discrimination with early vision mechanisms. J Opt Soc Am A 7(5):923–932, May 1990.

SLIDE 41

Representing textures

  • Textures are made up of quite stylised subelements, repeated in meaningful ways
  • Representation:
    – find the subelements, and represent their statistics
  • But what are the subelements, and how do we find them?
    – recall normalized correlation
    – find subelements by applying filters, looking at the magnitude of the response
  • What filters?
    – experience suggests spots and oriented bars at a variety of different scales
    – details probably don’t matter
  • What statistics?
    – within reason, the more the merrier
    – at least, mean and standard deviation
    – better, various conditional histograms

SLIDE 42

SLIDE 43

Figure: image → vertical filter / horizontal filter → squared responses → spatially blurred. Threshold the squared, blurred responses, then categorize texture based on those two bits.
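That pipeline can be sketched on a synthetic image with vertical stripes on the left and horizontal stripes on the right (a toy illustration; the finite-difference filters and box blur are stand-ins for the lecture's filters):

```python
import numpy as np

def box_blur(img, r=3):
    """Spatial blur: mean over a (2r+1)^2 box, clamped at the borders."""
    h, w = img.shape
    out = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            out[i, j] = img[max(0, i - r):i + r + 1, max(0, j - r):j + r + 1].mean()
    return out

# Synthetic texture: vertical stripes on the left half, horizontal on the right.
h, w = 32, 64
img = np.zeros((h, w))
img[:, :32] = (np.arange(32) % 4 < 2).astype(float)[None, :]
img[:, 32:] = (np.arange(h) % 4 < 2).astype(float)[:, None]

# Oriented filters: finite differences along x and y.
resp_v = np.diff(img, axis=1, prepend=0.0)   # responds to vertical structure
resp_h = np.diff(img, axis=0, prepend=0.0)   # responds to horizontal structure

# Square the responses, then blur them spatially.
energy_v = box_blur(resp_v ** 2)
energy_h = box_blur(resp_h ** 2)

# Thresholding the two blurred energies categorizes the texture per pixel.
vertical_wins = energy_v > energy_h
print(vertical_wins[16, 10], vertical_wins[16, 50])   # True False
```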

SLIDE 44

SLIDE 45

SLIDE 46

SLIDE 47

SIGGRAPH 1994

SLIDE 48

  • Show a block diagram of Heeger and Bergen, and demonstrate it working with Matlab code. Ask Ted for an example.
SLIDE 49

Learn: use filter marginal statistics.

Bergen and Heeger

SLIDE 50

Matlab examples

SLIDE 51

Bergen and Heeger results

SLIDE 52

Bergen and Heeger failures

SLIDE 53

De Bonet (and Viola)

SIGGRAPH 1997

SLIDE 54

DeBonet

Learn: use filter conditional statistics across scale.

SLIDE 55

DeBonet

SLIDE 56

DeBonet

SLIDE 57

Portilla and Simoncelli

  • Parametric representation.
  • About 1000 numbers to describe a texture.
  • Ok results; maybe as good as DeBonet.
SLIDE 58

Portilla and Simoncelli

SLIDE 59

Zhu, Wu, & Mumford, 1998

  • Principled approach.
  • Synthesis quality not great, but ok.
SLIDE 60

Zhu, Wu, & Mumford

Figure labels: cheetah; synthetic.
SLIDE 61

SLIDE 62

Efros and Leung

SLIDE 63

SLIDE 64

What we’ve learned from the previous texture synthesis methods

From Adelson and Bergen: examine filter outputs.
From Perona and Malik: use multi-scale, multi-orientation filters.
From Heeger and Bergen: use marginal statistics (histograms) of filter responses.
From DeBonet: use conditional filter responses across scale.

SLIDE 65

What we learned from Efros and Leung regarding texture synthesis

  • Don’t need conditional filter responses across scale
  • Don’t need marginal statistics of filter responses
  • Don’t need multi-scale, multi-orientation filters
  • Don’t need filters
SLIDE 66

Efros & Leung ’99

  • The algorithm
    – Very simple
    – Surprisingly good results
    – Synthesis is easier than analysis!
    – ...but very slow
  • Optimizations and Improvements
    – [Wei & Levoy, ’00] (based on [Popat & Picard, ’93])
    – [Harrison, ’01]
    – [Ashikhmin, ’01]
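A toy version of the per-pixel algorithm (a sketch, not the paper's code: it fills pixels in raster order rather than growing an onion-shell frontier, uses a fixed 5x5 window, and brute-forces the neighborhood search):

```python
import numpy as np

def efros_leung(sample, out_size=24, win=5, seed=0):
    """Fill an out_size x out_size image one pixel at a time: for each pixel,
    find the sample window whose already-known neighbors best match the
    pixel's neighborhood (masked SSD), then copy that window's center."""
    rng = np.random.default_rng(seed)
    r = win // 2
    h, w = sample.shape
    # Every candidate window in the input texture.
    wins = np.array([sample[i:i + win, j:j + win]
                     for i in range(h - win + 1) for j in range(w - win + 1)])
    out = np.zeros((out_size + 2 * r, out_size + 2 * r))
    known = np.zeros(out.shape, dtype=bool)
    # Seed the top-left corner with a random patch from the sample.
    si, sj = rng.integers(0, h - win), rng.integers(0, w - win)
    out[:win, :win] = sample[si:si + win, sj:sj + win]
    known[:win, :win] = True
    for i in range(r, out_size + r):
        for j in range(r, out_size + r):
            if known[i, j]:
                continue
            nb = out[i - r:i + r + 1, j - r:j + r + 1]
            mask = known[i - r:i + r + 1, j - r:j + r + 1]
            # SSD over the known neighbors only; pick the best-matching window.
            ssd = (((wins - nb) ** 2) * mask).sum(axis=(1, 2))
            out[i, j] = wins[np.argmin(ssd)][r, r]
            known[i, j] = True
    return out[r:-r, r:-r]

# A periodic checkerboard sample is continued exactly.
sample = (np.indices((12, 12)).sum(0) % 2).astype(float)
result = efros_leung(sample)
```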

SLIDE 67

Efros & Leung ’99 extended

Figure: a pixel p and its neighborhood in the input image; non-parametric sampling.

Idea: the unit of synthesis is a block B.

  • Observation: neighboring pixels are highly correlated
  • Exactly the same, but now we want P(B | N(B))
  • Much faster: synthesize all pixels in a block at once
  • Not the same as multi-scale!

Synthesizing a block

SLIDE 68

Image Quilting

  • Idea:
    – combine the random block placement of Chaos Mosaic with the spatial constraints of Efros & Leung
  • Related work (concurrent):
    – Real-time patch-based sampling [Liang et al., ’01]
    – Image Analogies [Hertzmann et al., ’01]

SLIDE 69

Figure, for an input texture and blocks B1, B2: (a) random placement of blocks; (b) neighboring blocks constrained by overlap; (c) minimal error boundary cut.

SLIDE 70

Minimal error boundary

Figure: overlapping blocks along a vertical boundary. The overlap error is the squared difference of the two blocks over the overlap region; the cut follows the minimal-error boundary through that error surface.
SLIDE 71

Our Philosophy

  • The “Corrupt Professor’s Algorithm”:
    – Plagiarize as much of the source image as you can
    – Then try to cover up the evidence
  • Rationale:
    – Texture blocks are by definition correct samples of texture, so the problem is only connecting them together

SLIDE 72

Algorithm

– Pick the size of the block and the size of the overlap
– Synthesize blocks in raster order
– Search the input texture for a block that satisfies the overlap constraints (above and left)
  • Easy to optimize using NN search [Liang et al., ’01]
– Paste the new block into the resulting texture
  • Use dynamic programming to compute the minimal error boundary cut
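The dynamic-programming step can be sketched as follows (a minimal sketch; the 3x3 error surface is made up to show the cut dodging a high-error pixel):

```python
import numpy as np

def min_error_boundary_cut(overlap_err):
    """Given the per-pixel error in the overlap region (e.g. squared difference
    of the two overlapping blocks), return the column of the vertical cut in
    each row, minimizing total error with |column change| <= 1 between rows."""
    h, w = overlap_err.shape
    cost = overlap_err.astype(float)
    for i in range(1, h):                     # accumulate minimal path cost
        for j in range(w):
            cost[i, j] += cost[i - 1, max(0, j - 1):min(w, j + 2)].min()
    path = [int(np.argmin(cost[-1]))]         # backtrack from the cheapest end
    for i in range(h - 2, -1, -1):
        j = path[-1]
        lo = max(0, j - 1)
        path.append(lo + int(np.argmin(cost[i, lo:min(w, j + 2)])))
    return path[::-1]

err = np.array([[5, 0, 5],
                [5, 9, 0],
                [5, 0, 5]])
print(min_error_boundary_cut(err))   # [1, 2, 1]: the cut sidesteps the 9
```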

SLIDE 73

SLIDE 74

SLIDE 75

SLIDE 76

SLIDE 77

SLIDE 78

SLIDE 79

SLIDE 80

Failures

(Chernobyl Harvest)

SLIDE 81

Texture Transfer

  • Take the texture from one object and “paint” it onto another object
    – This requires separating texture and shape
    – That’s HARD, but we can cheat
    – Assume we can capture shape by boundary and rough shading
  • Then, just add another constraint when sampling: similarity to the underlying image at that spot

SLIDE 82

Figure: parmesan texture + target image = result; rice texture + target image = result.

SLIDE 83


SLIDE 84


SLIDE 85

Figure panels: source texture; target image; source correspondence image; target correspondence image.

SLIDE 86


SLIDE 87

Figure: comparison on the input image: Portilla & Simoncelli; Xu, Guo & Shum; Wei & Levoy; Image Quilting.

SLIDE 88

Figure: comparison on the input image: Portilla & Simoncelli; Xu, Guo & Shum; Wei & Levoy; Image Quilting.

SLIDE 89

Homage to Shannon!

Figure: comparison on the input image: Portilla & Simoncelli; Xu, Guo & Shum; Wei & Levoy; Image Quilting.

SLIDE 90

Summary of image quilting

  • Quilt together patches of the input image
    – randomly (texture synthesis)
    – constrained (texture transfer)
  • Image Quilting
    – No filters, no multi-scale, no one-pixel-at-a-time!
    – Fast and very simple
    – Results are not bad

SLIDE 91

end