Daily Dose of Data Science


Deep learning

6 Types of Contexts for AI Agents
...explained from a context engineering perspective.
Oct 17 • Avi Chawla

A Memory-efficient Technique to Train Large Models
...that even LLMs like GPTs and LLaMAs use.
Oct 14 • Avi Chawla

Activation Pruning for Model Compression (with implementation)
Removing 74% of neurons with a 0.5% accuracy drop.
Oct 2 • Avi Chawla

How Dropout Actually Works
A lesser-known detail of Dropout.
Sep 10 • Avi Chawla

Train Neural Nets 4-6x Faster!
Explained with code and visuals.
Sep 8 • Avi Chawla

6 Graph Feature Engineering Techniques
Must-know for building GNNs.
Jul 31 • Avi Chawla

DropBlock vs. Dropout for Regularizing CNNs
Addressing a limitation of Dropout when used in CNNs.
Jul 9 • Avi Chawla

Bias-Variance Tradeoff is Incomplete!
A counterintuitive phenomenon while training ML models.
Jul 7 • Avi Chawla

Scale ML Models to Billions of Parameters
...with 4 simple changes to your PyTorch code.
Jul 3 • Avi Chawla

15 Techniques to Optimize Neural Network Training
...explained in a single frame.
Jun 27 • Avi Chawla

TabM: A Powerful Alternative to MLP Ensemble
32x parameter reduction without accuracy loss.
Jun 6 • Avi Chawla

48 Most Popular Open ML Datasets
...summarized in a single frame.
Jun 2 • Avi Chawla
© 2025 Avi Chawla