Thanks for sharing. A bug like that can set you back months.
Dear Avi, thanks for sharing!
I have a question: how can two models start with identical weights? As far as I know, the weights are initialized at random.
It's possible, Mahdi. If you start by setting a seed for the random generator, you can ensure that the two models start with the same parameters when initialized. Here's how I did it:
- np.random.seed(20)
- torch.manual_seed(20)
Train the first model. Now restart the notebook and, this time, train the second model. This ensures that Model2 gets initialized with the same weights as the first one. You can verify this experimentally as well :)
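To see the idea in isolation, here is a minimal sketch using NumPy only (torch.manual_seed plays the same role for PyTorch tensors). The seed value 20 comes from the snippet above; the 4x3 weight shape and the init_weights helper are just illustrative, not part of any real model:

```python
import numpy as np

def init_weights(seed):
    # Re-seeding right before initialization makes the random draw
    # deterministic, so every call with the same seed returns the
    # same "weights".
    np.random.seed(seed)
    return np.random.randn(4, 3)  # arbitrary 4x3 weight matrix

w1 = init_weights(20)  # "Model1" initialization
w2 = init_weights(20)  # "Model2" initialization, same seed

print(np.array_equal(w1, w2))  # True: identical initial weights
```

Restarting the notebook between training runs, as suggested above, serves the same purpose as calling the seeding functions again: both models draw their initial weights from a generator in the same state.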
Thank you, Avi, for your reply! I'll check it!