Hey Avi, can you make a post about dimensionality reduction techniques and their mathematical grounding (derivations), such as MDS, SVD, random projections, etc.?
Of course, Omar. Thanks so much for the suggestion :)
I've added it to my list and will cover it in an upcoming deep dive, since it will get pretty long.
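In the meantime, here's a tiny taste of the SVD case with plain numpy; the toy data and k=5 are illustrative assumptions, not from any real dataset, and the actual derivations will be in the post.

```python
# Truncated SVD as dimensionality reduction: keep the top-k singular directions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))   # toy data: 100 samples, 50 features
X = X - X.mean(axis=0)           # center the data before factorizing

# X ≈ U_k @ diag(S_k) @ Vt_k; the projection onto the top-k components
# is U_k scaled by the singular values.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 5
X_reduced = U[:, :k] * S[:k]     # shape (100, 5)

print(X_reduced.shape)           # (100, 5)
```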
Thanks, that's awesome! And ooof, I bet.
Thanks. Can we use both activation pruning and layers.Dropout() together to boost both efficiency and performance? My understanding is that both involve manipulating neurons.
Dropout is used during training, while activation pruning is a post-training technique: the network is trained first, and only then do you prune it based on its activations.
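To make the split concrete, here's a minimal Keras sketch, not a definitive recipe: the mean-activation criterion, the 20% pruning fraction, and the toy data are all illustrative assumptions.

```python
# Minimal sketch: dropout during training, activation pruning after.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

inputs = tf.keras.Input(shape=(20,))
hidden = layers.Dense(128, activation="relu", name="hidden")(inputs)
dropped = layers.Dropout(0.3)(hidden)        # active ONLY during training
outputs = layers.Dense(1, activation="sigmoid")(dropped)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy")

X = np.random.rand(256, 20).astype("float32")    # toy data
y = np.random.randint(0, 2, size=(256, 1))
model.fit(X, y, epochs=1, verbose=0)             # dropout randomly zeros units here

# --- post-training: activation pruning ---
# Measure each hidden unit's mean activation on sample data, then zero the
# weights of the least active units so they contribute nothing. The 20%
# fraction is an assumption for illustration.
feature_model = tf.keras.Model(inputs, model.get_layer("hidden").output)
mean_act = feature_model(X).numpy().mean(axis=0)              # shape (128,)
prune_idx = np.argsort(mean_act)[: int(0.2 * mean_act.size)]  # weakest 20%

W, b = model.get_layer("hidden").get_weights()   # W: (20, 128), b: (128,)
W[:, prune_idx] = 0.0
b[prune_idx] = 0.0
model.get_layer("hidden").set_weights([W, b])    # pruned units are now dead
```

The point is just the ordering: Dropout(0.3) only matters inside fit(), while the pruning step touches the already-trained weights.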
Oh, thanks a lot. My bad; I don't know why I mixed them up. I probably wasn't paying close enough attention while reading.