6 Comments

Hey Avi, can you make a post about dimensionality reduction techniques and their mathematical grounding (derivation) such as MDS, SVD, random projections, etc.?

author

Of course, Omar. Thanks so much for the suggestion :)

I have added this to my list and will surely cover this in an upcoming deep dive as this will get pretty long.


Thanks, that's awesome! And ooof, I bet.

Feb 24 · Liked by Avi Chawla

Thanks. Can we use activation pruning and layers.Dropout() together to boost both efficiency and performance? My understanding is that both involve manipulating neurons.

author

Dropout is applied during training. Activation pruning is a post-training step: once the network has been trained, you prune its least-active neurons.
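For anyone curious what "prune it using activation pruning" can look like in practice, here's a minimal NumPy sketch (the weights, calibration data, ReLU, and 25% threshold are all illustrative assumptions, not the exact method from the post): after training, record each neuron's average activation on some data, then drop the least-active neurons.

```python
import numpy as np

# Stand-in for one trained hidden layer (illustrative, random "trained" weights).
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16))   # 8 inputs -> 16 hidden neurons
X = rng.normal(size=(100, 8))  # calibration data fed through the trained layer

# 1) Record each hidden neuron's average activation (ReLU assumed).
acts = np.maximum(X @ W, 0)    # shape (100, 16)
mean_act = acts.mean(axis=0)   # per-neuron average activation

# 2) Prune after training: drop neurons whose average activation falls
#    in the bottom 25%, shrinking the layer without retouching training.
keep = mean_act >= np.quantile(mean_act, 0.25)
W_pruned = W[:, keep]

print(W.shape, W_pruned.shape)  # (8, 16) -> (8, 12)
```

So the two techniques don't conflict: Dropout zeroes random neurons per batch while you train, whereas this runs once on the finished network.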

Feb 24 · Liked by Avi Chawla

Oh, thanks a lot. My bad; I don't know why I mixed them up. I probably wasn't paying enough attention while reading.
