🔮 Sales prediction with Temporal Convolutional Networks (TCN)#
Today I was reviewing a very interesting project on time series forecasting using PyTorch and a lesser-known but very powerful architecture: Temporal Convolutional Networks (TCN).
💡 Why is it interesting?#
Because TCNs process long time windows in parallel, avoiding the sequential bottleneck of recurrent models like LSTM or GRU. That makes them well suited to problems such as sales, demand, energy, or any complex temporal signal.
🚀 Key ideas of the approach#
- 📦 Real dataset: store and category sales (Kaggle).
- 🧱 TCN architecture: 1D convolutions + dilation to expand temporal “receptive field.”
- ⚙️ Input window: 120 days → Prediction: 16 days.
- 🧹 Preprocessing: pivot data, scale, create sequences.
- 🧠 PyTorch training: compact, efficient model and easy to parallelize.
- 📤 Final outcome: predictions ready to submit to Kaggle.
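The architecture described above can be sketched in a few lines of PyTorch. This is a minimal illustrative version, not the project's actual code: the channel count, number of levels, and the `CausalConv1d` helper are my assumptions; only the 120-day input and 16-day horizon come from the post.

```python
# Minimal TCN sketch (illustrative; not the original project's code).
# Input: a 120-day window per series -> forecast of the next 16 days.
import torch
import torch.nn as nn

class CausalConv1d(nn.Module):
    """1D convolution padded on the left so outputs never see the future."""
    def __init__(self, in_ch, out_ch, kernel_size, dilation):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):
        # Pad only on the left of the time axis (causal padding).
        return self.conv(nn.functional.pad(x, (self.pad, 0)))

class TCN(nn.Module):
    def __init__(self, channels=32, levels=4, kernel_size=3, horizon=16):
        super().__init__()
        layers, in_ch = [], 1
        for i in range(levels):
            # Dilation doubles per level: 1, 2, 4, 8, ...
            layers += [CausalConv1d(in_ch, channels, kernel_size, dilation=2 ** i),
                       nn.ReLU()]
            in_ch = channels
        self.net = nn.Sequential(*layers)
        self.head = nn.Linear(channels, horizon)  # read out from the last time step

    def forward(self, x):                 # x: (batch, 1, 120)
        h = self.net(x)                   # (batch, channels, 120)
        return self.head(h[:, :, -1])     # (batch, 16)

model = TCN()
window = torch.randn(8, 1, 120)   # 8 series, 120 past days each
forecast = model(window)
print(forecast.shape)             # torch.Size([8, 16])
```

Because every layer is a convolution, the whole 120-step window is processed in one parallel pass, which is where the speed advantage over step-by-step recurrence comes from.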
🧩 Quick explanation#
Imagine you want to predict sales by looking at the past.
A TCN works like a magnifying glass moving over the time series, but with a trick:
🔍 the lens can “jump” in time thanks to dilation, allowing it to see distant patterns without recurrent networks.
Instead of remembering step by step (like an LSTM), a TCN analyzes many past points at once, which makes it faster to train and more stable.
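The "jumping lens" intuition has a simple arithmetic behind it: with kernel size k, each layer adds (k − 1) × dilation steps of context, so doubling the dilation per layer makes the receptive field grow exponentially with depth. A quick sketch (illustrative numbers, not from the project):

```python
# How dilation grows the receptive field of a stacked causal TCN.
def receptive_field(kernel_size, dilations):
    # Each layer contributes (kernel_size - 1) * dilation extra time steps.
    return 1 + sum((kernel_size - 1) * d for d in dilations)

dilations = [2 ** i for i in range(6)]   # 1, 2, 4, 8, 16, 32
rf = receptive_field(3, dilations)
print(rf)  # 127 -> six layers of kernel-3 convs already cover a 120-day window
```

Only six thin layers are needed to see 120 days back, whereas an LSTM would have to carry that information through 120 sequential steps.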
More information at the link 👇

