Temporal Convolutional Networks

Temporal Convolutional Networks, or TCNs, are convolutional networks designed to process sequential data. They have two core components: causal convolutions and dilated convolutions. Causal convolutions ensure that the prediction at a given time step depends only on information from previous time steps, never on future ones. Dilated convolutions let a TCN cover a wide receptive field without requiring many layers: the dilation factor typically grows exponentially with layer depth (e.g. 1, 2, 4, 8, ...), so the receptive field grows exponentially as well. A minimal code sketch follows.
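The sketch below illustrates these two components in PyTorch (an assumed framework choice; the class names, channel sizes, layer count, and ReLU activations are illustrative, not taken from the referenced paper). Each layer left-pads its input so outputs never depend on future time steps, and the dilation doubles at every layer.

```python
# Minimal sketch of a TCN-style stack of causal, dilated 1D convolutions.
# Names and hyperparameters are illustrative assumptions, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalConv1d(nn.Module):
    """1D convolution that left-pads the input so output[t] sees only inputs <= t."""

    def __init__(self, in_channels, out_channels, kernel_size, dilation):
        super().__init__()
        # Left padding of (kernel_size - 1) * dilation keeps the convolution causal
        # and preserves the sequence length.
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_channels, out_channels, kernel_size,
                              dilation=dilation)

    def forward(self, x):  # x: (batch, channels, time)
        x = F.pad(x, (self.pad, 0))  # pad only on the left (the past)
        return self.conv(x)


class TinyTCN(nn.Module):
    """Stack of causal convolutions with exponentially growing dilation (1, 2, 4, ...)."""

    def __init__(self, channels=32, kernel_size=3, num_layers=4):
        super().__init__()
        layers = []
        for i in range(num_layers):
            layers += [CausalConv1d(channels, channels, kernel_size,
                                    dilation=2 ** i),
                       nn.ReLU()]
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)


if __name__ == "__main__":
    x = torch.randn(8, 32, 100)   # (batch, channels, time steps)
    y = TinyTCN()(x)
    print(y.shape)                # torch.Size([8, 32, 100]) -- length is preserved
```

With kernel size 3 and dilations 1, 2, 4, 8, the four layers above already cover a receptive field of 1 + 2*(1 + 2 + 4 + 8) = 31 time steps, which shows how exponential dilation widens the receptive field with few layers.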
Related concepts:
Convolutional Neural Network
Related video:
https://youtube.com/shorts/mxbum3mXtkw
External reference:
https://arxiv.org/abs/1611.05267