Daily Dose of Data Science
Deep learning
6 Types of Contexts for AI Agents
...explained from a context engineering perspective.
Oct 17 • Avi Chawla
A Memory-efficient Technique to Train Large Models
...that even LLMs like GPTs and LLaMAs use.
Oct 14 • Avi Chawla
Activation Pruning for Model Compression (with implementation)
Removing 74% of neurons with a 0.5% accuracy drop.
Oct 2 • Avi Chawla
How Dropout Actually Works
A lesser-known detail of Dropout.
Sep 10 • Avi Chawla
Train Neural Nets 4-6x Faster!
Explained with code and visuals.
Sep 8 • Avi Chawla
6 Graph Feature Engineering Techniques
Must-know for building GNNs.
Jul 31 • Avi Chawla
DropBlock vs. Dropout for Regularizing CNNs
Addressing a limitation of Dropout when used in CNNs.
Jul 9 • Avi Chawla
Bias-Variance Tradeoff is Incomplete!
A counterintuitive phenomenon observed while training ML models.
Jul 7 • Avi Chawla
Scale ML Models to Billions of Parameters
...with 4 simple changes to your PyTorch code.
Jul 3 • Avi Chawla
15 Techniques to Optimize Neural Network Training
...explained in a single frame.
Jun 27 • Avi Chawla
TabM: A Powerful Alternative to MLP Ensemble
32x parameter reduction without accuracy loss.
Jun 6 • Avi Chawla
48 Most Popular Open ML Datasets
...summarized in a single frame.
Jun 2 • Avi Chawla