Fast Edge Detection Using Structured Forests

  1. Fast Edge Detection Using Structured Forests by Piotr Dollár and C. Lawrence Zitnick (PAMI 2015). Vojtěch Cvrček, Reading Group Presentation, June 6, 2018.

  2. Problem Definition: given a set of training images and corresponding segmentation masks, predict an edge map for a novel input image.

  3. Edge structure (figure-only slide in the original presentation).

  4. Random Forest - Single Tree: a tree $f_t : \mathcal{X} \to \mathcal{Y}$ maps a sample $x \in \mathcal{X}$ to an output by routing it from the root to a leaf. Each internal node $j$ applies a binary split function $h_{\theta_j} : \mathcal{X} \to \{L, R\}$ that sends $x$ to the left or right child. Two families of split functions are used: $h^1_j(x) = [x(k) < \tau]$ with $\theta^1_j = (k, \tau)$, and $h^2_j(x) = [x(k_1) - x(k_2) < \tau]$ with $\theta^2_j = (k_1, k_2, \tau)$; they are combined as $h_{\theta_j}(x) = \delta\, h^1_j(x) + (1 - \delta)\, h^2_j(x)$ with $\delta \in \{0, 1\}$. (Figure in the original slides: a binary tree whose leaves store the outputs $y_1, \dots, y_7$.)
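A minimal Python sketch of the split functions and the tree traversal just described; the `Node` container and the feature-vector indexing are assumptions of this sketch, not the authors' implementation.

```python
# Sketch of the two split-function families and single-tree prediction.
from dataclasses import dataclass
from typing import Optional, Any
import numpy as np

@dataclass
class Node:
    delta: int = 1        # selects h^1 (delta = 1) or h^2 (delta = 0)
    k1: int = 0
    k2: int = 0
    tau: float = 0.0
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    y: Any = None         # prediction stored at a leaf (None for internal nodes)

def h_theta(node: Node, x: np.ndarray) -> bool:
    """Split function h_theta_j(x); True routes x to the left child."""
    if node.delta == 1:
        return x[node.k1] < node.tau               # h^1_j(x) = [x(k) < tau]
    return x[node.k1] - x[node.k2] < node.tau      # h^2_j(x) = [x(k1) - x(k2) < tau]

def predict_tree(root: Node, x: np.ndarray):
    """f_t(x): follow the splits from the root until a leaf is reached."""
    node = root
    while node.left is not None:
        node = node.left if h_theta(node, x) else node.right
    return node.y
```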

  5. Training: each tree is trained independently in a recursive manner. For a given node $j$ and training set $S_j \subset \mathcal{X} \times \mathcal{Y}$, parameters $\theta_j$ are randomly sampled from the parameter space, and the $\theta_j$ resulting in a 'good' split of the data is selected.

  6. Training - Information Gain Criterion: split quality is measured by an information gain criterion $I_j = I(S_j, S_j^L, S_j^R)$ (1), where $S_j^L = \{(x, y) \in S_j \mid h(x, \theta_j) = 0\}$ and $S_j^R = S_j \setminus S_j^L$ are the two splits, and $\theta_j = \operatorname{argmax}_\theta I_j(S_j, \theta)$. For multiclass classification ($\mathcal{Y} \subset \mathbb{Z}$) the standard definition of information gain is $I_j = H(S_j) - \sum_{k \in \{L, R\}} \frac{|S_j^k|}{|S_j|} H(S_j^k)$ (2), where $H(S)$ is either the Shannon entropy $H(S) = -\sum_y p_y \log(p_y)$ or, alternatively, the Gini impurity $H(S) = \sum_y p_y (1 - p_y)$.
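A sketch of the criterion in Eq. (2) and of picking $\theta_j$ among randomly sampled candidates, reusing `Node` and `h_theta` from the earlier sketch; the candidate-sampling ranges are illustrative assumptions.

```python
# Information gain for the multiclass case and random candidate selection.
import numpy as np

def entropy(labels: np.ndarray) -> float:
    """Shannon entropy H(S) = -sum_y p_y log(p_y)."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

def gini(labels: np.ndarray) -> float:
    """Gini impurity H(S) = sum_y p_y (1 - p_y)."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float((p * (1 - p)).sum())

def information_gain(labels: np.ndarray, go_left: np.ndarray, H=entropy) -> float:
    """I_j = H(S_j) - sum_{k in {L,R}} |S_j^k| / |S_j| * H(S_j^k)."""
    if go_left.all() or not go_left.any():
        return 0.0                      # degenerate split
    n = len(labels)
    return (H(labels)
            - go_left.sum() / n * H(labels[go_left])
            - (~go_left).sum() / n * H(labels[~go_left]))

def best_split(X: np.ndarray, labels: np.ndarray, n_candidates=64, rng=None):
    """Randomly sample theta_j = (delta, k1, k2, tau) and keep the best I_j."""
    rng = rng or np.random.default_rng(0)
    best_gain, best_node = -1.0, None
    for _ in range(n_candidates):
        node = Node(delta=int(rng.random() < 0.5),
                    k1=int(rng.integers(X.shape[1])),
                    k2=int(rng.integers(X.shape[1])),
                    tau=float(rng.choice(X.ravel())))
        go_left = np.array([h_theta(node, x) for x in X])
        gain = information_gain(labels, go_left)
        if gain > best_gain:
            best_gain, best_node = gain, node
    return best_node, best_gain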

  7. Training: training stops when a maximum depth is reached, or when the information gain or the training set size falls below a fixed threshold. A single output $y \in \mathcal{Y}$ is then assigned to each leaf node based on a problem-specific ensemble model.
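A sketch of the recursive training loop with these stopping rules, reusing `Node`, `h_theta`, and `best_split` from the sketches above. The thresholds and the majority-vote leaf (the multiclass ensemble model) are illustrative; for structured outputs the leaf instead stores a representative patch chosen by the ensemble model described later.

```python
# Recursive training with the stopping rules above (multiclass labels).
import numpy as np

def train_tree(X, labels, depth=0, max_depth=8, min_size=8, min_gain=1e-3, rng=None):
    if depth < max_depth and len(labels) >= min_size:
        node, gain = best_split(X, labels, rng=rng)
    else:
        node, gain = None, 0.0
    if node is None or gain < min_gain:
        # stop: assign a single output y to the leaf (here: majority vote)
        leaf = Node()
        values, counts = np.unique(labels, return_counts=True)
        leaf.y = values[np.argmax(counts)]
        return leaf
    go_left = np.array([h_theta(node, x) for x in X])
    node.left = train_tree(X[go_left], labels[go_left], depth + 1,
                           max_depth, min_size, min_gain, rng)
    node.right = train_tree(X[~go_left], labels[~go_left], depth + 1,
                            max_depth, min_size, min_gain, rng)
    return node
```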

  8. Testing: how the results from multiple trees are combined depends on a problem-specific ensemble model: classification → majority voting, regression → averaging.
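A sketch of the two combination rules, treating a forest as a list of the trees trained above.

```python
# Combining tree predictions at test time.
import numpy as np

def predict_forest_classification(forest, x):
    """Majority vote over the outputs of all trees."""
    votes = [predict_tree(tree, x) for tree in forest]
    values, counts = np.unique(votes, return_counts=True)
    return values[np.argmax(counts)]

def predict_forest_regression(forest, x):
    """Average the outputs of all trees."""
    return float(np.mean([predict_tree(tree, x) for tree in forest]))
```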

  9. Structured Forests: the output space $\mathcal{Y}$ is structured (e.g. segmentation masks of image patches). Using structured outputs in random forests presents the following challenges: (i) computing splits for structured output spaces of high dimension/complexity is time consuming, and (ii) it is unclear how to define the information gain. Solution: map (in some way) the structured output space $\mathcal{Y}$ into a multiclass space $\mathcal{C} = \{1, \dots, k\}$ and use the multiclass information gain $I_j$.

  10. Structured Edges Clustering: how can splits be computed efficiently? Can the task be transformed into a multiclass problem, $\mathcal{Y} \to \mathcal{C} = \{1, \dots, k\}$? Start with an intermediate mapping $\Pi : \mathcal{Y} \to \mathcal{Z}$ (3), where $z = \Pi(y)$ is a long binary vector that encodes, for every pair of pixels in $y$, whether the two pixels belong to the same or to different segments.
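A sketch of the mapping $\Pi$ in Eq. (3) for a single $16 \times 16$ patch, assuming the patch is given as a 2-D array of integer segment ids.

```python
# Pi: Y -> Z, one binary entry per pair of pixels (same segment or not).
import numpy as np
from itertools import combinations

def pi_mapping(y: np.ndarray) -> np.ndarray:
    """Return the binary vector z of length C(256, 2) for a 16x16 label patch."""
    labels = y.ravel()                              # 256 segment ids
    pairs = combinations(range(labels.size), 2)     # every pair of pixels
    return np.array([labels[i] == labels[j] for i, j in pairs], dtype=np.uint8)
```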

  11. Structured Edges Clustering: the dimension of the vectors $z \in \mathcal{Z}$ is reduced by PCA to $m = 5$, and clustering (k-means) splits $\mathcal{Z}$ into $k = 2$ clusters: $\mathcal{Y} \xrightarrow{\text{pairs}} \mathcal{Z} \xrightarrow{\text{PCA, k-means}} \mathcal{C}$. The information gain $I_j$ is then computed as in the multiclass case.

  12. Structured Edges Clustering: since the elements of $\mathcal{Y}$ are of size $16 \times 16$, the dimension of $\mathcal{Z}$ is $\binom{256}{2}$. Computing all pairs is too expensive, so only $m = 256$ randomly sampled dimensions of $\mathcal{Z}$ are computed: $\mathcal{Y} \xrightarrow{\text{sampled pairs}} \mathcal{Z} \xrightarrow{\text{PCA, k-means}} \mathcal{C}$.
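A sketch of the full discretization $\mathcal{Y} \to \mathcal{Z} \to \mathcal{C}$ with sampled pairs, PCA, and k-means, using scikit-learn for the last two steps; the way pixel pairs are sampled here is an illustrative choice.

```python
# Discretize a set of 16x16 segmentation patches into k classes.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def discretize(patches: np.ndarray, m=256, n_pca=5, k=2, rng=None):
    """patches: array of shape (n, 16, 16) with integer segment ids."""
    rng = rng or np.random.default_rng(0)
    n_pixels = patches[0].size
    i = rng.integers(n_pixels, size=m)               # sampled pixel pairs
    j = rng.integers(n_pixels, size=m)
    flat = patches.reshape(len(patches), -1)
    z = (flat[:, i] == flat[:, j]).astype(float)     # m sampled dimensions of z
    z_pca = PCA(n_components=n_pca).fit_transform(z) # reduce to 5 dimensions
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(z_pca)
    return labels                                    # class labels in C = {0, ..., k-1}
```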

  13. Ensemble model: during training a single prediction must be assigned to each leaf node, and during testing multiple predictions must be combined into one. To select a single output from a set $y_1, \dots, y_k \in \mathcal{Y}$: compute $z_i = \Pi(y_i)$ and select $y_{k^*}$ such that $k^* = \operatorname{argmin}_k \sum_{i, j} (z_{k, j} - z_{i, j})^2$ (the medoid).
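A sketch of the medoid selection, reusing `pi_mapping` from the earlier sketch (in practice a sampled subset of the pair dimensions suffices).

```python
# Select the medoid patch: the y_k* closest in Z to all other candidates.
import numpy as np

def select_medoid(patches):
    """patches: list of 16x16 label patches y_1, ..., y_K."""
    Z = np.stack([pi_mapping(y) for y in patches]).astype(float)
    # k* = argmin_k sum_{i,j} (z_{k,j} - z_{i,j})^2
    cost = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(axis=(1, 2))
    return patches[int(np.argmin(cost))]
```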
