Evaluating Context-Aware Saliency Detection Method, Christine Sawyer, Santa Barbara City College (PowerPoint presentation)



SLIDE 1

Evaluating Context-Aware Saliency Detection Method

Christine Sawyer, Santa Barbara City College, Computer Science & Mechanical Engineering
Mentors: Jiejun Xu & Zefeng Ni
Advisor: Prof. B.S. Manjunath, Vision Research Lab

Funding: Office of Naval Research Defense University Research Instrumentation Program

SLIDE 4

What is Visual Saliency?

  • Visual saliency: the subjective perceptual quality that makes certain items stand out more than others.

  • Saliency models aim to mimic human perception.

[Figure: original image and human fixation map, from Bruce et al.]
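As a toy illustration of the idea (not the method evaluated in this talk): a pixel can be scored as salient simply by how far its color lies from the image's mean color, so regions that "stand out" get high values. The function name and the example image below are hypothetical.

```python
import numpy as np

def simple_saliency(image):
    """Minimal global-contrast saliency sketch: a pixel is salient
    if its color differs from the image's mean color.
    `image` is an H x W x 3 float array with values in [0, 1]."""
    mean_color = image.reshape(-1, 3).mean(axis=0)
    # Euclidean distance of each pixel's color from the mean color
    sal = np.linalg.norm(image - mean_color, axis=2)
    # Normalize to [0, 1] so the map can be viewed as a grayscale image
    return (sal - sal.min()) / (sal.max() - sal.min() + 1e-8)

# A red square on a gray background stands out in the resulting map
img = np.full((8, 8, 3), 0.5)
img[2:6, 2:6] = [1.0, 0.0, 0.0]
sal = simple_saliency(img)
```

Real models such as those cited later (Itti et al., Bruce et al.) are far richer, using multi-scale center-surround contrast or information-theoretic measures rather than a single global statistic.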

SLIDE 8

Using the EyeLink 1000 as a tool

  • High-speed infrared camera
  • Illuminator
  • Potential applications:
    • Image segmentation
    • Image retargeting
    • Image search & retrieval

Learning gaze patterns by tracking eye movement

SLIDE 11

Looking at the context of an image

  • Sometimes looking at just the dominant object is not enough.
  • Context-aware saliency: extract the salient object together with the surroundings that give the image its meaning.
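The core of the measure evaluated here (Goferman et al.) can be sketched at a single scale: a patch is salient when even its most similar patches differ from it in color, and that color distance is discounted for spatially nearby patches, which pulls in meaningful context around the salient object. This is a simplified sketch: one patch scale only, no immediate-context enhancement step, small toy values for the constants c and K, and the implementation details are mine.

```python
import numpy as np

def context_aware_saliency(patches, positions, c=3.0, K=5):
    """Single-scale sketch of context-aware patch saliency.
    `patches`: N x D array of flattened patch appearance vectors.
    `positions`: N x 2 array of normalized patch centers."""
    n = len(patches)
    # Pairwise appearance (color) and position distances
    d_color = np.linalg.norm(patches[:, None] - patches[None, :], axis=2)
    d_pos = np.linalg.norm(positions[:, None] - positions[None, :], axis=2)
    # Dissimilarity: color distance discounted by spatial proximity,
    # d(p_i, p_j) = d_color / (1 + c * d_position)
    d = d_color / (1.0 + c * d_pos)
    sal = np.empty(n)
    for i in range(n):
        others = np.delete(d[i], i)          # drop self-distance
        k_nearest = np.sort(others)[:K]      # K most similar patches
        # Salient if even the most similar patches are dissimilar
        sal[i] = 1.0 - np.exp(-k_nearest.mean())
    return sal
```

In the full method this is repeated at several patch scales and averaged, and pixels near highly salient ones are boosted to keep the surrounding context.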

SLIDE 14

Context-Aware Saliency Detection

  • Four basic principles of human visual attention
  • Use an eye tracker to evaluate the algorithm
    – What do people look at to determine the scenario of an image?
    – Viewing time
    – Categories

[Goferman et al.]

SLIDE 16

The effect of viewing time

[Fixation maps after 2 seconds vs. 5 seconds of viewing]

  • More time allows in-depth analysis:
    • Dominant object
    • Surroundings
SLIDE 17

How image categories affect how you look

  • Sports
    – Person(s) participating
    – Sports equipment

SLIDE 21

Insight from preliminary experiments

  • Test participants need a specific task
    – People search images aimlessly when given no task.
    – People get distracted by their prior knowledge.
  • Time constraint: 4 seconds per image

SLIDE 25

Experimental Process

  • 60 images from various categories, each shown for 4 seconds to each of the 17 viewers.
  • Task: look at the parts that best describe the image, then give a brief description of the scene.
  • Goal: evaluate context-aware saliency and build a dataset that can provide ground-truth data.
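One common way such an evaluation is carried out (a sketch of standard practice, not necessarily this project's exact protocol; the helper names are mine): turn the recorded fixation points into a ground-truth density map, and score a saliency map by how well it ranks fixated pixels above the rest of the image (an ROC AUC).

```python
import numpy as np

def fixation_map(fixations, shape, sigma=10.0):
    """Ground-truth density map: a Gaussian bump at each recorded
    fixation. `fixations` is a list of (row, col) points from the
    eye tracker; `shape` is the image size (H, W)."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    m = np.zeros(shape)
    for fy, fx in fixations:
        m += np.exp(-((ys - fy) ** 2 + (xs - fx) ** 2) / (2 * sigma ** 2))
    return m / m.max()

def auc_score(saliency, fixations):
    """ROC AUC: probability that the saliency value at a fixated
    pixel exceeds that at a randomly chosen pixel (ties count half)."""
    sal_at_fix = np.array([saliency[y, x] for y, x in fixations])
    all_vals = saliency.ravel()
    greater = (sal_at_fix[:, None] > all_vals[None, :]).mean()
    ties = (sal_at_fix[:, None] == all_vals[None, :]).mean()
    return greater + 0.5 * ties
```

A score near 1.0 means the algorithm's map concentrates on exactly the regions people fixated; a score near 0.5 means it does no better than chance.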

SLIDE 26

Categories of Results

  • Algorithm matches human perception
  • Algorithm partially matches human perception
  • Algorithm does not match human perception
SLIDE 27

Algorithm matches human perception

  • Image has a simple background
  • Salient portion(s) differ distinctly in color and/or texture

[Figure: original image alongside the context-aware saliency output]

SLIDE 28

Experiment Results

SLIDE 29

Matching human perception

SLIDE 32

Algorithm misses part of the salient portion

  • Image has a simple foreground
    – People look more at high-level features such as faces
    – The salient portion can share color and/or texture with its surroundings

[Figure: original image alongside the context-aware saliency output]

SLIDE 33

Experiment Results

SLIDE 34

Partially matching human perception

SLIDE 37

Algorithm differs from human perception

  • The image is very busy
  • The dominant object is not obvious

[Figure: original image alongside the context-aware saliency output]

SLIDE 38

Experiment Results

SLIDE 39

Contrasting human perception

SLIDE 43

Conclusion and Future Plans

  • Match to human perception
    – Simple background and distinct foreground
  • Partial match to human perception
    – Plain foreground with a more complex background
  • Contrast to human perception
    – Busy image
    – Unclear main object
  • Future work: study the effects of...
    – Blurring and noise in the image
    – People's prior knowledge/background

SLIDE 44

References

[1] S. Goferman, L. Zelnik-Manor, and A. Tal, "Context-Aware Saliency Detection," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2010.
[2] W. Wang, Y. Wang, Q. Huang, and W. Gao, "Measuring Visual Saliency by Site Entropy Rate," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2010.
[3] L. Itti, C. Koch, and E. Niebur, "A Model of Saliency-Based Visual Attention for Rapid Scene Analysis," IEEE TPAMI, 1998.
[4] N. D. Bruce and J. Tsotsos, "Saliency Based on Information Maximization," NIPS, 2006.
[5] J. Harel, C. Koch, and P. Perona, "Graph-Based Visual Saliency," NIPS, 2006.
[6] X. Hou and L. Zhang, "Dynamic Visual Attention: Searching for Coding Length Increments," NIPS, 2008.

SLIDE 45

Acknowledgements

  • INSET
  • Prof. Manjunath
  • Jiejun Xu & Zefeng Ni
  • Vision Research Lab
  • Volunteers for my experiment
  • Professors, Family, & Friends