Hi,

I'm fine-tuning a transformer with a linear head on different few-shot datasets and then evaluating each model on a test set. However, the linear layer's weights appear to be initialised differently on each run. Ideally the runs would converge to similar solutions as training converges, but in the meantime: is it possible to initialise all the linear weights with the same values each time?
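For context, this is the kind of determinism I'm after — a minimal sketch (the layer sizes and seed value are just placeholders):

```python
import torch
import torch.nn as nn

def make_head(in_features=768, num_classes=5, seed=0):
    # Seeding the RNG right before creating the layer makes its
    # default init draw the same values on every run.
    # in_features / num_classes / seed are illustrative values only.
    torch.manual_seed(seed)
    return nn.Linear(in_features, num_classes)

head_a = make_head()
head_b = make_head()
# Both heads start from identical weights and biases.
assert torch.equal(head_a.weight, head_b.weight)
assert torch.equal(head_a.bias, head_b.bias)
```

I'd want something like this, but applied to the linear layer inside my fine-tuning setup so every few-shot run starts from the same initialisation.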

Thanks!