6 Comments
Omar AlSuwaidi:

Hey Avi, can you make a post about dimensionality reduction techniques and their mathematical grounding (derivation) such as MDS, SVD, random projections, etc.?

Avi Chawla:

Of course, Omar. Thanks so much for the suggestion :)

I have added this to my list and will cover it in an upcoming deep dive, as it will get pretty long.

Omar AlSuwaidi:

Thanks, that's awesome! And ooof, I bet.

Srinivas:

Thanks. Can we use both activation pruning and layers.Dropout() together to boost both efficiency and performance? My understanding is that both involve manipulation of neurons.

Avi Chawla:

Dropout is used during training. Activation pruning is a post-training step: the network is trained first, and then you prune it using activation pruning.
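To make the distinction concrete, here is a minimal NumPy sketch (an illustration, not the article's exact method). The toy layer, sample data, and 25% pruning threshold are made up for the example: dropout randomly zeroes activations on each training-time forward pass, while activation pruning measures average activations on sample data after training and permanently removes the weakest neurons.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "trained" layer: 8 inputs -> 16 hidden neurons, plus sample data.
W = rng.normal(size=(8, 16))
X = rng.normal(size=(100, 8))

def relu(z):
    return np.maximum(0.0, z)

# Dropout (training time): randomly zero activations on each forward pass.
def forward_with_dropout(X, W, p=0.5):
    h = relu(X @ W)
    mask = rng.random(h.shape) >= p     # keep each unit with prob 1 - p
    return h * mask / (1.0 - p)         # inverted-dropout scaling

# Activation pruning (post-training): measure average activations on
# sample data, then drop the consistently weakest neurons for good.
h = relu(X @ W)
mean_act = h.mean(axis=0)                      # average activation per neuron
keep = mean_act > np.percentile(mean_act, 25)  # prune the weakest 25%
W_pruned = W[:, keep]                          # smaller layer for inference

print(W.shape, "->", W_pruned.shape)
```

Note how dropout leaves the layer's size unchanged (it only masks activations per pass), while pruning actually shrinks the weight matrix used at inference.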

Srinivas:

Oh, thanks a lot. My bad; I don't know why I mixed them up. I probably wasn't paying enough attention while reading.
