
Convolution NEU 466M Instructor: Professor Ila R. Fiete Spring 2016 - PowerPoint PPT Presentation



  1. Convolution NEU 466M Instructor: Professor Ila R. Fiete Spring 2016

  2. Convolution. g: a time-varying signal $\{\cdots, g_{t-1}, g_t, g_{t+1}, \cdots\}$ sampled at discrete intervals; h: another time-varying signal, $\{\cdots, h_{t-1}, h_t, h_{t+1}, \cdots\}$, not necessarily of the same length. $(g * h)(n) = \sum_{m=-\infty}^{\infty} g(n-m)\,h(m)$. Finite-length g, h: g*h has length N+M-1, where N = length(g), M = length(h).
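
A minimal MATLAB sketch of this definition (the vectors g and h below are arbitrary examples): conv implements the sum above, and the output has length N+M-1.

      % Discrete convolution of two finite-length signals (example values).
      g = [1 2 3 4];          % N = 4
      h = [1 0 -1];           % M = 3
      y = conv(g, h);         % (g*h)(n) = sum over m of g(n-m) h(m)
      disp(length(y))         % prints 6 = N + M - 1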

  3. Properties of convolution. $(g * h)(n) = \sum_{m=-\infty}^{\infty} g(n-m)\,h(m)$
  • Commutative/symmetric (unlike cross-correlation): $g * h = h * g$
  • Associative: $f * (g * h) = (f * g) * h$
  • Distributive: $f * (g + h) = f * g + f * h$
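
A quick numerical check of these properties in MATLAB (f, g, h are arbitrary example vectors; g and h are given the same length so that g + h is defined):

      % Verify commutativity, associativity, and distributivity numerically.
      f = [1 -1 2];  g = [3 0 1 2];  h = [0.5 1 -2 4];
      max(abs(conv(g,h)         - conv(h,g)))               % commutative:  ~0
      max(abs(conv(f,conv(g,h)) - conv(conv(f,g),h)))       % associative:  ~0
      max(abs(conv(f,g+h)       - (conv(f,g) + conv(f,h)))) % distributive: ~0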

  4. Convolution. $(g * h)(n) = \sum_{m=-\infty}^{\infty} g(n-m)\,h(m)$. Typically, one series is short and the other long.
  • The long series is called the "signal"; the other is called the "kernel".
  • The convolution is viewed as a weighted version / moving average of the signal by the kernel.

  5. Convolution. $(g * h)(n) = \sum_{m=-\infty}^{\infty} g(n-m)\,h(m)$. Say h: kernel (short / "finite support"), g: signal (long). Picture the flipped kernel $[\cdots 0\ \ h_2\ h_1\ h_0\ h_{-1}\ h_{-2}\ \ 0 \cdots]$ aligned under the signal tape $[\cdots g_{n-3}\ g_{n-2}\ g_{n-1}\ g_n\ g_{n+1}\ g_{n+2}\ g_{n+3} \cdots]$ with $h_0$ beneath $g_n$; the sum of the elementwise products is $(g * h)_n$.

  6. Convolution. $(g * h)(n) = \sum_{m=-\infty}^{\infty} g(n-m)\,h(m)$. For the next output sample, the same flipped kernel $[\cdots 0\ \ h_2\ h_1\ h_0\ h_{-1}\ h_{-2}\ \ 0 \cdots]$ sits under $[\cdots g_{n-2}\ g_{n-1}\ g_n\ g_{n+1}\ g_{n+2}\ g_{n+3}\ g_{n+4} \cdots]$, with $h_0$ now beneath $g_{n+1}$, giving $(g * h)_{n+1}$: flip h (kernel) and keep it in one place, move the g-tape (signal) left.

  7. Convolution. "Signal" g, "kernel" h. $(g * h)(n) = \sum_{m=-\infty}^{\infty} g(n-m)\,h(m)$. Equivalently: flip h (kernel), keep the g-tape (signal) fixed, and sweep h (kernel) rightward; each position of the kernel yields one sample of g * h.
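
The flip-and-sweep picture on slides 5-7 can be written out directly. A minimal sketch (variable names are my own) that flips the kernel once, slides it along a zero-padded copy of the signal, and matches MATLAB's conv:

      % "Flip and slide": compute (g*h)(n) = sum over m of g(n-m) h(m) by brute force.
      g = randn(1,50);                 % example "signal" (long)
      h = [0.2 0.5 0.2];               % example "kernel" (short)
      N = length(g);  M = length(h);
      gpad  = [zeros(1,M-1) g zeros(1,M-1)];   % pad so the kernel can hang off both ends
      hflip = fliplr(h);                       % flip the kernel once
      y = zeros(1, N+M-1);
      for n = 1:N+M-1
          y(n) = sum(gpad(n:n+M-1) .* hflip);  % slide the flipped kernel; dot product
      end
      max(abs(y - conv(g,h)))                  % ~0: agrees with conv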

  8.-18. Convolution (animation frames): the flipped kernel h sweeps rightward along the signal g, building up g * h one sample at a time.

  19. Common convolution kernels
  • Boxcar: $h = 1/N$ for N samples, 0 elsewhere (e.g. rates from spikes).
  • Exponential: $h(t) = \frac{1}{\tau} e^{-t/\tau}$ for $t > 0$, 0 otherwise (e.g. EPSPs from spikes). Called a linear low-pass filter.
  • Gaussian: $h(t) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-t^2/2\sigma^2}$ (e.g. smoothing).
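
These three kernels can be built directly; a sketch with illustrative parameter values (dt, N, tau, sigma are my choices), each scaled so the samples sum to ~1 and the convolution acts as a weighted average:

      % Common kernels sampled at interval dt (parameter values are illustrative).
      dt   = 0.001;                                   % 1 ms sampling
      N    = 50;
      hbox = ones(1,N)/N;                             % boxcar: 1/N for N samples

      tau  = 0.020;  t = 0:dt:0.2;
      hexp = (1/tau)*exp(-t/tau)*dt;                  % causal exponential, area ~1

      sigma = 0.010;  t2 = -0.05:dt:0.05;
      hgau  = exp(-t2.^2/(2*sigma^2))/sqrt(2*pi*sigma^2)*dt;   % Gaussian, area ~1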

  20. Spikes to rate, smoothing, EPSPs: MATLAB DEMOS
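
I don't have the course demo files, so here is a stand-in sketch in the same spirit: a toy spike train convolved with a boxcar (rate estimate) and with an exponential kernel (EPSP-like trace). Rates, durations, and time constants are made up.

      % Toy spike train: 1 where a spike occurs, 0 elsewhere (10 s at 1 ms resolution).
      dt     = 0.001;
      spikes = double(rand(1,10000) < 20*dt);           % ~20 Hz Poisson-like spiking

      rate = conv(spikes, ones(1,50)/50, 'same') / dt;  % boxcar -> smoothed firing rate (Hz)

      tau  = 0.020;  t = 0:dt:0.2;
      hexp = (1/tau)*exp(-t/tau);
      epsp = conv(spikes, hexp) * dt;                   % exponential kernel -> EPSP-like trace
      epsp = epsp(1:length(spikes));                    % trim to the signal length

      plot((0:length(spikes)-1)*dt, rate)               % or plot epsp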

  21. Edge detection, HDR imaging: RETINA AS A CONVOLUTIONAL FILTER

  22. Mach bands (Ernst Mach, 1860s). Eight bars of stepped grayscale intensity. Each bar: constant intensity.

  23. Interesting perceptual effect in Mach bands: next to each step, the lighter side appears even lighter and the darker side even darker. Illumination at a point on the retina is not perceived objectively, but only in reference to its neighbours. Why/how does this happen?

  24. Electrophysiology of a retinal ganglion cell (RGC): difference-of-Gaussians or center-surround receptive field.

  25. Anatomy of the RGC circuit. An off-center (surround) stimulus has the reverse effect of an on-center stimulus because of inhibitory horizontal cells.

  26. Reproducing the Mach band illusion: RETINAL KERNEL SENSES CHANGES IN ILLUMINATION (MATLAB)
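
Again, a stand-in for the actual demo: build a staircase luminance profile (eight constant-intensity bars) and convolve it with a difference-of-Gaussians kernel. The filtered output overshoots on the light side of each step and undershoots on the dark side, reproducing the Mach band percept. All sizes and widths here are made up.

      % Mach bands: staircase luminance filtered by a difference-of-Gaussians (DoG) kernel.
      lum = kron(1:8, ones(1,100));                    % 8 bars, 100 pixels each, constant intensity
      x   = -30:30;
      s1  = 2;  s2 = 6;                                % center (narrow) and surround (wide) widths
      dog = exp(-x.^2/(2*s1^2))/sqrt(2*pi*s1^2) ...
          - exp(-x.^2/(2*s2^2))/sqrt(2*pi*s2^2);       % DoG kernel, sigma1 < sigma2
      resp = conv(lum, dog, 'same');                   % "retinal" response
      plot(resp)                                       % bumps and dips flank each step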

  27. Ganglion cells code contrast: difference in brightness between center and surround

  28. Retinal filter performs edge detection

  29. Demo: RETINAL KERNEL AS EDGE DETECTOR (MATLAB)
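
A companion stand-in sketch: apply the same kind of DoG kernel to a single step edge. Because the kernel sums to roughly zero, flat regions map to ~0 and the response is concentrated at the edge, which is how the peak (or zero crossing) localizes it.

      % Edge detection with a DoG kernel (parameters are illustrative).
      sig = [zeros(1,200) ones(1,200)];                % a single step edge
      x   = -30:30;  s1 = 2;  s2 = 6;
      dog = exp(-x.^2/(2*s1^2))/sqrt(2*pi*s1^2) ...
          - exp(-x.^2/(2*s2^2))/sqrt(2*pi*s2^2);
      edgeresp = conv(sig, dog, 'same');
      [~, loc] = max(abs(edgeresp));                   % peak response falls near index 200, the edge
      plot(edgeresp)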

  30. How edge detection works: theory. Gaussian smoothing filter: $H_\sigma(x) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-x^2/2\sigma^2}$. Retinal filter model: $H_{\mathrm{retinal}}(x) = \frac{1}{\sqrt{2\pi\sigma_1^2}} e^{-x^2/2\sigma_1^2} - \frac{\alpha}{\sqrt{2\pi\sigma_2^2}} e^{-x^2/2\sigma_2^2} = H_{\sigma_1}(x) - H_{\sigma_2}(x)$ (taking $\alpha = 1$), with $\sigma_1 < \sigma_2$: a difference of Gaussians. How to interpret the retinal / difference-of-Gaussians filter?

  31. How edge detection works: theory. $-\frac{d^2}{dx^2} H_\sigma(x) = -\frac{d^2}{dx^2}\left[\frac{1}{\sqrt{2\pi\sigma^2}} e^{-x^2/2\sigma^2}\right] = \frac{1}{\sigma^2}\left(1 - \frac{x^2}{\sigma^2}\right) H_\sigma(x) \approx H_{\sigma_1}(x) - H_{\sigma_2}(x)$: a difference of Gaussians with some $\sigma_1 < \sigma_2$. D. Marr, E. Hildreth (1980), "Theory of edge detection," Proc. R. Soc. Lond. B, 207:187-217.
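
A numerical shape check of that approximation (sigmas chosen for illustration; both curves are normalized to unit peak, since the approximation holds up to an overall scale factor):

      % Shape comparison: -d^2/dx^2 of a Gaussian vs. a difference of Gaussians.
      x    = -10:0.01:10;
      sg   = @(s) exp(-x.^2/(2*s^2))/sqrt(2*pi*s^2);   % unit-area Gaussian of width s
      s    = 2.5;
      logk = (1/s^2)*(1 - x.^2/s^2).*sg(s);            % -H_sigma''(x), from the formula above
      dogk = sg(2) - sg(3);                            % sigma1 = 2 < sigma2 = 3
      plot(x, logk/max(logk), x, dogk/max(dogk))       % normalized: closely matched "Mexican hat" shapes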

  32. How edge detection works: theory. $-\frac{d^2}{dx^2} H_\sigma(x) = \frac{1}{\sigma^2}\left(1 - \frac{x^2}{\sigma^2}\right) H_\sigma(x) \approx H_{\sigma_1}(x) - H_{\sigma_2}(x)$. [Plot: $-\frac{d^2}{dx^2} H_\sigma(x)$ overlaid on a difference of Gaussians $H_{\sigma_1}(x) - H_{\sigma_2}(x)$.]

  33. How edge detection works: theory. $H_{\mathrm{retinal}} \approx H_{\sigma_1}(x) - H_{\sigma_2}(x) \approx -\frac{d^2}{dx^2} H_\sigma(x)$: a 2nd derivative applied to a smoothing filter. The retinal filter (difference of Gaussians) is like a smoothing filter followed by a 2nd-derivative filter: $H_{\mathrm{retinal}} \approx H_{\text{2nd diff}} * H_{\text{smooth}}$.
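
A sketch of that factorization with my own discretization: convolving a discrete second-difference kernel [-1 2 -1] (a sampled $-\frac{d^2}{dx^2}$) with a Gaussian smoothing kernel yields a center-surround profile much like the difference of Gaussians.

      % Smoothing followed by a second difference ~ center-surround kernel.
      x       = -20:20;  s = 3;
      hsmooth = exp(-x.^2/(2*s^2))/sqrt(2*pi*s^2);     % Gaussian smoothing kernel
      h2nd    = [-1 2 -1];                             % discrete -d^2/dx^2 (negated second difference)
      hret    = conv(hsmooth, h2nd);                   % ~ retinal (center-surround) kernel
      plot(hret)                                       % positive center, negative surround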

  34. High dynamic-range imaging. Retinex-based adaptive filter: global compression, local processing.

  35. Comparison: convolution, cross-correlation, autocorrelation. Image: Wikimedia Commons, https://en.wikipedia.org/wiki/Convolution
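
A small comparison of the three operations on made-up vectors: cross-correlation is convolution with one input time-reversed (and conjugated, for complex data), and autocorrelation is a signal's cross-correlation with itself.

      % Convolution vs. cross-correlation vs. autocorrelation (real-valued examples).
      g  = [1 2 3 0 -1];
      h  = [1 -1 2];
      cv = conv(g, h);                  % convolution
      cc = conv(g, fliplr(h));          % cross-correlation of g with h (reverse h; conj it if complex)
      ac = conv(g, fliplr(g));          % autocorrelation of g (symmetric about its center)
      % With the Signal Processing Toolbox, xcorr(g,h) and xcorr(g) return the same values,
      % up to zero padding when g and h have different lengths.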

  36. Summary
  • Convolution: a kernel (short) acts on a signal (long) to produce a locally reweighted version of the signal.
  • Useful in an engineering sense: smooth signals, extract rates from spikes, template matching, other processing.
  • The operations of the retina on a visual stimulus may be interpreted as convolution.
  • The retinal difference-of-Gaussians convolution: edge enhancement, edge detection, contrast normalization.
