I have a question. The hack you proposed is great, but we actually need the on-the-fly transformations every single epoch if we are using a stochastic augmentation (e.g. RandAugment()) to expose the model to new alterations at every training step, which is the common approach as it significantly boosts the model's generalization ability.
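To make what I mean by "on-the-fly" concrete, here is a minimal sketch of my own (not from the post), assuming torchvision is installed and the samples are PIL images: the transform runs inside `__getitem__`, so each epoch draws fresh random augmentations.

```python
# Minimal sketch: on-the-fly stochastic augmentation, re-sampled on every fetch.
from torch.utils.data import Dataset
from torchvision import transforms

class AugmentedDataset(Dataset):
    def __init__(self, images, labels):
        self.images = images          # e.g. a list of PIL images (illustrative)
        self.labels = labels
        # Stochastic policy: new ops and magnitudes are sampled on every call
        self.transform = transforms.Compose([
            transforms.RandAugment(),
            transforms.ToTensor(),
        ])

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        # Re-applied on every fetch, i.e. on every epoch / training step
        return self.transform(self.images[idx]), self.labels[idx]
```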
Nice blog, sir. Instead of using shuffle, it's better to use `sampler = RandomSampler(dataset, replacement=False, num_samples=len(dataset))`, which ensures every sample is used exactly once per epoch.
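For anyone who wants to try it, a small self-contained sketch of what I mean (the tiny TensorDataset is just a stand-in so the snippet runs end to end):

```python
import torch
from torch.utils.data import DataLoader, RandomSampler, TensorDataset

# Tiny stand-in dataset, only here so the example is runnable.
dataset = TensorDataset(torch.randn(100, 3), torch.randint(0, 2, (100,)))

# replacement=False draws a fresh permutation each epoch, so every sample
# appears exactly once. (Older PyTorch versions reject num_samples together
# with replacement=False; plain RandomSampler(dataset) behaves the same there.)
sampler = RandomSampler(dataset, replacement=False, num_samples=len(dataset))

# sampler and shuffle are mutually exclusive, so shuffle is left at its default.
loader = DataLoader(dataset, batch_size=32, sampler=sampler)
```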
The only reason I can think of for applying the transformations on the fly is augmentation: if the transformations are applied anew in every epoch, it can improve robustness. The only worry one could have is whether they want the data that was already transformed in the first epoch to be transformed again or not (a small sketch of one way to handle this is below).
Maybe you will clarify this in the next blog; just thought to share my thoughts.
Thanks 👍
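One way to handle that worry, sketched here purely as my own illustration (assuming torchvision and PIL inputs; the transform choices are placeholders): apply the deterministic preprocessing once and cache it, and re-apply only the stochastic augmentation on the fly, so nothing already transformed gets transformed twice.

```python
# Minimal sketch: split one-time deterministic preprocessing from per-epoch augmentation.
from torch.utils.data import Dataset
from torchvision import transforms

preprocess = transforms.Resize((224, 224))    # deterministic, safe to apply once and cache
augment = transforms.Compose([
    transforms.RandomHorizontalFlip(),         # stochastic, re-applied every epoch
    transforms.ToTensor(),
])

class SplitTransformDataset(Dataset):
    def __init__(self, images, labels):
        self.images = [preprocess(img) for img in images]  # one-time transformation
        self.labels = labels

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        # Only the stochastic part is re-applied; the cached data is never re-preprocessed
        return augment(self.images[idx]), self.labels[idx]
```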
Pretty Awesome,
Thanks for pointing it out :)