Time-aware Large Kernel Convolutions
Vasileios Lioutas and Yuhong Guo ICML | 2020
Brief Overview
In this work, we introduce a novel sequence modeling approach called TaLK Convolutions that is not based on self-attention.
Experimental results suggest that this method yields results comparable to competitive self-attention-based and convolution-based methods. Instead of using a fixed-size kernel, TaLK Convolutions learn to predict, for each timestep, the size of a summation convolution kernel.
Sequence modeling is at the core of many applications, such as natural language processing, video processing, time-series analysis, etc.
[Karpathy, 2015]
These methods model sequences at different levels of abstraction.
Currently, self-attention is considered vital for modern sequence learning approaches.
Convolution-based approaches, in contrast, can only aggregate information over a limited context window.
Can we match this performance with a faster, non-attention method?
The key idea: for each timestep $t$, sum together an appropriate number of vector representations, $o_t = \sum_{i = t - \alpha_t^\ell}^{t + \alpha_t^r} x_i$, where $\alpha_t^\ell$ and $\alpha_t^r$ are the left and right offsets (boundaries).
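The per-timestep adaptive summation can be sketched in NumPy. This is a naive reference implementation; the function name `talk_sum` is mine, and integer offsets are assumed, whereas the paper interpolates fractional offsets:

```python
import numpy as np

def talk_sum(x, left, right):
    """Naive O(n*k) reference: for each timestep t, sum the vectors
    x[t - left[t] .. t + right[t]] (clipped to the sequence bounds).

    x: (n, d) token representations; left, right: (n,) integer offsets.
    """
    n, _ = x.shape
    out = np.zeros_like(x)
    for t in range(n):
        lo = max(0, t - left[t])          # left boundary
        hi = min(n - 1, t + right[t])     # right boundary (inclusive)
        out[t] = x[lo:hi + 1].sum(axis=0)
    return out
```

With `left = [0, 1, 1]` and `right = [1, 0, 0]`, for example, each output position mixes a different-sized neighborhood of the input.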
Computing every window sum directly takes $O(nk)$ time: the same representations are summed again and again. Instead, a summed-area table $S_t = S_{t-1} + x_t$ (with $S_0 = 0$) is built once with the prefix sum algorithm, so each output reduces to $o_t = S_{t + \alpha_t^r} - S_{t - \alpha_t^\ell - 1}$ in $O(1)$, and the whole sequence costs $O(n)$.
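The prefix-sum trick can be sketched as follows (a minimal sketch with a hypothetical helper name; integer offsets assumed, while the paper interpolates fractional ones):

```python
import numpy as np

def talk_sum_fast(x, left, right):
    """O(n) adaptive window sums via a summed-area table:
    S[t] = x[0] + ... + x[t-1], so each window sum is one subtraction."""
    n, d = x.shape
    S = np.zeros((n + 1, d))
    np.cumsum(x, axis=0, out=S[1:])           # prefix sums, built once
    t = np.arange(n)
    lo = np.clip(t - left, 0, n - 1)          # left boundary per timestep
    hi = np.clip(t + right, 0, n - 1)         # right boundary (inclusive)
    return S[hi + 1] - S[lo]                  # O(1) per timestep
```

Each output is a single subtraction of two table entries, regardless of how large the learned kernel is.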
The offsets are predicted from each token representation with a sigmoid and then scaled, $\alpha_t^\ell = \ell \cdot \sigma(W^\ell x_t)$ and $\alpha_t^r = r \cdot \sigma(W^r x_t)$, where $\ell$ and $r$ are the maximum allowed tokens to the left and to the right.
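Offset prediction can be sketched like this (`W_l` and `W_r` stand in for the learned parameters; rounding to integers replaces the paper's fractional-offset interpolation):

```python
import numpy as np

def predict_offsets(x, W_l, W_r, max_left, max_right):
    """Sigmoid of a linear map gives relative offsets in (0, 1);
    scaling by the max tokens allowed on each side gives the boundaries."""
    a_l = 1.0 / (1.0 + np.exp(-(x @ W_l)))    # (n,) relative left offsets
    a_r = 1.0 / (1.0 + np.exp(-(x @ W_r)))    # (n,) relative right offsets
    left = np.round(max_left * a_l).astype(int)
    right = np.round(max_right * a_r).astype(int)
    return left, right
```

Because the offsets come from the token itself, every timestep ends up with its own kernel size.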
Each sum is normalized by its window length, $\alpha_t^\ell + \alpha_t^r + 1$, keeping a consistent scale for the representation values passed to the next layers.
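The normalization step itself is a single division by the window length (a sketch; the helper name is mine):

```python
import numpy as np

def normalize_output(window_sums, left, right):
    """Divide each summed window by its length left + right + 1 so the
    output magnitude does not grow with the learned kernel size."""
    lengths = (left + right + 1).astype(float)     # (n,) window lengths
    return window_sums / lengths[:, None]
```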
In the decoder, the kernel is restricted to the past (causal) direction.
The learned kernel size thus adapts to the number of tokens that are needed to model a timestep.
Similar to multi-head attention, the channels are split into groups, where the hidden dimension is divisible by the number of groups, and separate offsets are predicted per group, allowing a different kernel size for each group at every timestep.
Conclusion
We presented a new approach to sequence modeling based on large adaptive convolutions without using self-attention or a variant of it. The kernel size at each timestep is learned in a time-aware way.
github.com/lioutasb/TaLKConvolutions
Thank you!