

  1. Interpretable Deep Learning: Towards Understanding & Explaining DNNs. Part 2: Methods of Explanation. Wojciech Samek, Grégoire Montavon, Klaus-Robert Müller

  2. What Will Be Covered in Part 2: explaining individual decisions, interpreting predicted classes.

  3. Explaining Individual Decisions. Q: Where in the image does the neural network see evidence for a car? (classes: non-car / car)

  4. Examples of Methods that Explain Decisions

  5. Explaining Individual Decisions (classes: non-car / car). Q: In what proportion has each car contributed to the prediction?

  6. Explaining by Decomposing. Goal: determine the share of the output that should be attributed to each input variable (input passed through the DNN to the prediction). Decomposition property: see below.
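
The decomposition property itself appears on the slide as a formula image. A standard statement of it, as used throughout the LRP literature, is that the relevance scores assigned to the input variables sum to the network output:

    \sum_i R_i = f(x)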

  7. Explaining by Decomposing. Goal: determine the share of the output that should be attributed to each input variable. Decomposing a prediction is generally difficult.

  8. Sensitivity Analysis. The input is passed through the DNN, which outputs evidence for "car"; the explanation for "car" is a heatmap that computes, for each pixel, the quantity below.
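
The per-pixel quantity is shown on the slide as a formula image. In the sensitivity-analysis literature accompanying this tutorial it is the squared partial derivative of the class score with respect to the pixel:

    R_i = \left( \frac{\partial f}{\partial x_i}(x) \right)^2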

  9. Sensitivity Analysis. Question: If sensitivity analysis computes a decomposition of something, then what does it decompose?

  10. Sensitivity Analysis. Sensitivity analysis explains a variation of the function, not the function value itself. (Variation = making something appear more or less like a car.)
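
Summing the sensitivity scores above makes explicit what is being decomposed: the squared gradient norm rather than the prediction itself,

    \sum_i R_i = \| \nabla f(x) \|^2 ,

which is why sensitivity analysis answers "what would make the image more or less a car" rather than "what makes the image a car".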

  11. The Taylor Expansion Approach. 1. Take a linear model. 2. Perform a first-order expansion at a root point. 3. Identify the linear terms: they form a decomposition. Observation: the explanation depends on the root point. (Expansion sketched below.)
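
The expansion referred to in steps 1-3 appears on the slide as formula images; its standard first-order form is

    f(x) = f(\tilde{x}) + \sum_i \underbrace{\left. \frac{\partial f}{\partial x_i} \right|_{x = \tilde{x}} (x_i - \tilde{x}_i)}_{R_i} + \text{higher-order terms},

so that at a root point (f(\tilde{x}) = 0) the linear terms R_i form a decomposition of f(x).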

  12. The Taylor Expansion Approach. The obtained relevance scores depend on the root point. How to choose the root point?
      - Closeness to the actual data point.
      - Membership to the input domain (e.g. pixel space).
      - Membership to the data manifold.

  13. Non-Linear Models. For a non-linear model, the second-order terms are hard to interpret and can be very large. Simple Taylor decomposition is therefore not suitable for highly non-linear models.

  14. Overcoming Non-Linearity. Integrated Gradients [Sundararajan'17]: fully decomposable; requires computing an integral (expensive); which integration path? [Sundararajan'17] Axiomatic Attribution for Deep Networks. ICML 2017: 3319-3328.
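
For reference, the Integrated Gradients attribution of [Sundararajan'17] for input variable x_i, with baseline (root point) \tilde{x}, is

    R_i = (x_i - \tilde{x}_i) \int_0^1 \frac{\partial f}{\partial x_i}\big( \tilde{x} + t\,(x - \tilde{x}) \big)\, dt ,

i.e. the gradient is integrated along the straight-line path from the baseline to the data point, which is where both the computational cost and the path-selection question come from.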

  15. Overcoming Non-Linearity. Special case: when the origin is a root point and the gradient along the integration path is constant, the attribution reduces to gradient × input.
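
Under these assumptions (\tilde{x} = 0 a root point, constant gradient along the path) the integral collapses and the attribution becomes

    R_i = x_i \cdot \frac{\partial f}{\partial x_i}(x) ,

i.e. the gradient × input rule.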

  16. Let's consider a different approach ...

  17. Overcoming Non-Linearity. View the decision as a graph computation instead of a function evaluation, and propagate the decision backwards until the input is reached.

  18. Layer-Wise Relevance Propagation (LRP) [Bach'15]

  19. Gradient-Based vs. LRP

  20. Layer-Wise Relevance Propagation (LRP) [Bach'15]. A carefully engineered propagation rule, annotated on the slide with: the neuron contribution, the relevance available for redistribution, a pooling/normalization term, and the received messages (see the formula below).
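
The propagation rule is shown on the slide as an annotated formula image. Its basic (LRP-0) form, with a_j the activations of the lower layer, w_{jk} the weights, and R_k the relevance arriving at the upper-layer neurons, is

    R_j = \sum_k \frac{a_j w_{jk}}{\sum_j a_j w_{jk}} R_k ,

where a_j w_{jk} is the neuron contribution, R_k is the relevance available for redistribution, the denominator is the pooling/normalization term, and each summand is one of the received messages.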

  21. LRP Propagation Rules: Two Views. View 1 annotates the rule in terms of: neuron contribution, relevance available for redistribution, pooling/normalization term, and received messages. View 2 annotates it in terms of: neuron activation, weighted sum, relevance available for redistribution, and normalization term (see below).
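
The two views are algebraically the same rule written in two ways (with z_k = \sum_j a_j w_{jk}):

    View 1:  R_j = \sum_k \frac{a_j w_{jk}}{z_k} R_k
    View 2:  R_j = a_j \sum_k w_{jk} \frac{R_k}{z_k}

View 1 reads as messages flowing back from each upper-layer neuron k; View 2 factors out the activation a_j, which is what the implementation slides exploit.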

  22. Implementing Propagation Rules (1). The second view of the rule (neuron activation, weighted sum, relevance available for redistribution, normalization term) splits into element-wise operations and vector operations, as sketched below.
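
A minimal NumPy sketch of this split into element-wise and vector operations for a dense layer; the function name, argument shapes, and the eps stabilizer are illustrative assumptions rather than taken from the slide:

```python
import numpy as np

def lrp_dense(a, W, R, eps=1e-6):
    # a: lower-layer activations, shape (d,)
    # W: weight matrix, shape (d, k)
    # R: relevance of the upper layer, shape (k,)
    z = a @ W + eps    # vector op: weighted sums z_k (normalization term), stabilized
    s = R / z          # element-wise: relevance per unit of pre-activation
    c = W @ s          # vector op: propagate the normalized relevance backwards
    return a * c       # element-wise: multiply by the neuron activations
```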

  23. Implementing Propagation Rules (2). Code that reuses forward and gradient computations. See also http://www.heatmapping.org/tutorial
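
A minimal PyTorch sketch of the same step, written so that it reuses the framework's forward and gradient computations in the spirit of the heatmapping.org tutorial; the function name and the eps stabilizer are illustrative:

```python
import torch

def lrp_step(a, layer, R, eps=1e-6):
    # a: lower-layer activations, layer: a torch.nn module (e.g. nn.Linear or nn.Conv2d),
    # R: relevance at the layer output
    a = a.clone().detach().requires_grad_(True)
    z = layer(a) + eps            # step 1: reuse the forward computation (stabilized)
    s = (R / z).detach()          # step 2: element-wise division by the normalizer
    (z * s).sum().backward()      # step 3: reuse the gradient computation
    c = a.grad                    # gradient of sum(z * s) w.r.t. the input activations
    return (a * c).detach()       # step 4: relevance at the layer input
```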

  24. How Fast is LRP? For a GPU-based implementation of LRP, check out iNNvestigate [Alber'18]: https://github.com/albermax/innvestigate

  25. Is there an underlying mathematical framework for LRP?

  26. Deep Taylor Decomposition [Montavon'17]. Suppose that we have propagated the relevance down to a given layer. Question: How should it be propagated one layer further? Idea: by performing a Taylor expansion of the relevance.

  27. The Structure of Relevance. Reminder: the rule involves the neuron activation, a weighted sum, the relevance available for redistribution, and a normalization term. Observation: the relevance at each layer is a product of the activation and an approximately locally constant term.
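
In symbols (reusing the notation of the propagation rule above), the relevance of neuron j can be written as

    R_j = a_j c_j   with   c_j = \sum_k w_{jk} \frac{R_k}{z_k} ,

where the term c_j is treated as approximately locally constant in the activations.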

  28. Deep Taylor Decomposition. Relevance of a neuron, its Taylor expansion, and the resulting redistribution (formulas sketched below).
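
The three formula images on this slide correspond, in the notation of [Montavon'17], to expanding the relevance R_k, viewed as a function of the lower-layer activations a, around a root point \tilde{a}:

    R_k(a) = R_k(\tilde{a}) + \sum_j \left. \frac{\partial R_k}{\partial a_j} \right|_{\tilde{a}} (a_j - \tilde{a}_j) + \dots

and redistributing the linear terms as messages

    R_{j \leftarrow k} = \left. \frac{\partial R_k}{\partial a_j} \right|_{\tilde{a}} (a_j - \tilde{a}_j), \qquad R_j = \sum_k R_{j \leftarrow k}.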

  29. Choosing the Root Point (Deep Taylor, generic case). Choice of root point: 1. nearest root ✔; 2. rescaled excitations (same as LRP-α₁β₀).
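
With the rescaled-excitations root point, the deep Taylor derivation yields the propagation rule that keeps only positive contributions; in the notation used above, with w^{+} = \max(0, w),

    R_j = \sum_k \frac{a_j w_{jk}^{+}}{\sum_j a_j w_{jk}^{+}} R_k ,

which is the rule known as LRP-\alpha_1\beta_0 (equivalently the z^{+}-rule).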
