Daily Dose of Data Science

Deep learning

How Dropout Actually Works
A lesser-known detail of Dropout.
Sep 10 • Avi Chawla

Train Neural Nets 4-6x Faster!
Explained with code and visuals.
Sep 8 • Avi Chawla

6 Graph Feature Engineering Techniques
Must-know for building GNNs.
Jul 31 • Avi Chawla

DropBlock vs. Dropout for Regularizing CNNs
Addressing a limitation of Dropout when used in CNNs.
Jul 9 • Avi Chawla

Bias-Variance Tradeoff is Incomplete!
A counterintuitive phenomenon while training ML models.
Jul 7 • Avi Chawla

Scale ML Models to Billions of Parameters
...with 4 simple changes to your PyTorch code.
Jul 3 • Avi Chawla

15 Techniques to Optimize Neural Network Training
...explained in a single frame.
Jun 27 • Avi Chawla

TabM: A Powerful Alternative to MLP Ensemble
32x parameter reduction without accuracy loss.
Jun 6 • Avi Chawla

48 Most Popular Open ML Datasets
...summarized in a single frame.
Jun 2 • Avi Chawla

5 Chunking Strategies For RAG
...explained in a single frame.
May 29 • Avi Chawla

Memory Pinning to Accelerate Model Training
A simple technique, and some key considerations.
May 7 • Avi Chawla

Knowledge Distillation using Teacher Assistant
Improved model compression via an intermediate teacher-assistant model.
Apr 25 • Avi Chawla

© 2025 Avi Chawla