Daily Dose of Data Science


Deep learning

6 Types of Contexts for AI Agents
...explained from a context engineering perspective.
Oct 17 • Avi Chawla

A Memory-efficient Technique to Train Large Models
...that even LLMs like GPTs and LLaMAs use.
Oct 14 • Avi Chawla

Activation Pruning for Model Compression (with implementation)
Removing 74% of neurons with a 0.5% accuracy drop.
Oct 2 • Avi Chawla

How Dropout Actually Works
A lesser-known detail of Dropout.
Sep 10 • Avi Chawla

Train Neural Nets 4-6x Faster!
Explained with code and visuals.
Sep 8 • Avi Chawla

6 Graph Feature Engineering Techniques
Must-know for building GNNs.
Jul 31 • Avi Chawla

DropBlock vs. Dropout for Regularizing CNNs
Addressing a limitation of Dropout when used in CNNs.
Jul 9 • Avi Chawla

Bias-Variance Tradeoff is Incomplete!
A counterintuitive phenomenon while training ML models.
Jul 7 • Avi Chawla

Scale ML Models to Billions of Parameters
...with 4 simple changes to your PyTorch code.
Jul 3 • Avi Chawla

15 Techniques to Optimize Neural Network Training
...explained in a single frame.
Jun 27 • Avi Chawla

TabM: A Powerful Alternative to MLP Ensemble
32x parameter reduction without accuracy loss.
Jun 6 • Avi Chawla

48 Most Popular Open ML Datasets
...summarized in a single frame.
Jun 2 • Avi Chawla
© 2025 Avi Chawla