Is manually copying weights bad?

Is copying weights in convnet layers a bad idea?
I was performing model inflation on a 2D conv architecture (that is, taking each 2D kernel and stacking copies of it together to form a 3D kernel).
I manually copied the weights recursively from the 2D architecture, but when I ran a forward pass I got the following error: “cuDNN requires contiguous weight tensor”.

Is it somehow due to the fact that I manually copied the weights?
(The forward pass worked before, when I didn't copy the weights!)
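For reference, the copying looks roughly like this (a simplified, single-layer sketch; the real code walks the 2D model recursively, and the layer sizes here are made up):

```python
import torch
import torch.nn as nn

# made-up layer sizes, just to illustrate the 2D -> 3D inflation
conv2d = nn.Conv2d(3, 16, kernel_size=3, padding=1)
depth = 3
conv3d = nn.Conv3d(3, 16, kernel_size=(depth, 3, 3), padding=1)

with torch.no_grad():
    # (out, in, kH, kW) -> (out, in, D, kH, kW): repeat the 2D kernel along the
    # new depth dimension; dividing by depth keeps the output scale comparable
    w2d = conv2d.weight / depth
    conv3d.weight = nn.Parameter(w2d.unsqueeze(2).expand(-1, -1, depth, -1, -1))
    conv3d.bias = nn.Parameter(conv2d.bias.clone())
```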

It’s most likely due to your copies. Try calling .contiguous() on your weight tensors.
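For example, if the 2D kernel was stacked with an op like expand(), the inflated weight is a non-contiguous view, and cuDNN complains about it. Making it contiguous before assigning it avoids the error (a minimal sketch with made-up sizes):

```python
import torch
import torch.nn as nn

conv3d = nn.Conv3d(3, 16, kernel_size=3, padding=1)
w2d = torch.randn(16, 3, 3, 3)                         # stand-in for the copied 2D weights

inflated = w2d.unsqueeze(2).expand(-1, -1, 3, -1, -1)  # view, not a real copy
print(inflated.is_contiguous())                        # False -> triggers the cuDNN error

with torch.no_grad():
    conv3d.weight = nn.Parameter(inflated.contiguous())  # materialize a contiguous copy
    # or keep the parameter's existing contiguous storage:
    # conv3d.weight.copy_(inflated)

print(conv3d.weight.is_contiguous())                   # True
```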
