How to change the params in one Conv1d layer

import torch
import torch.nn as nn
import pdb
import torch.nn.init as init
# Third convolutional layer
conv3 = nn.Conv1d(64, 128, kernel_size=3)

ab = conv3.weight

new_params = torch.ones(128, 2, 3)  # Example: generate new parameters
conv3.weight.data[:, 1:3, :] = new_params
cd = conv3.weight
print(sum(cd - ab))

For the above code, I think ab holds the previous params of conv3 and cd holds the newly replaced weights, so they should be different. But why do they have the same value? Thanks for your reply.
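A quick check (not in the original post) makes the cause visible: assigning conv3.weight to a name does not copy anything, so ab and cd are two names for the very same Parameter, and an in-place write shows up through both. A minimal sketch, assuming the same conv3 as above:

import torch
import torch.nn as nn

conv3 = nn.Conv1d(64, 128, kernel_size=3)
ab = conv3.weight   # no copy is made here
cd = conv3.weight

print(ab is cd)                         # True: same Parameter object
print(ab.data_ptr() == cd.data_ptr())   # True: same underlying storage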

They are the same because ab is not a snapshot: conv3.weight returns a reference to the same Parameter, so the in-place assignment changes the tensor that both ab and cd point to. If you want to keep the old values, clone the weight first (e.g. ab = conv3.weight.detach().clone()). Also, don’t use the deprecated .data attribute; copy the new values directly into the weight in a no_grad context:

with torch.no_grad():
    conv3.weight[:, 1:3].copy_(new_params)
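Putting it together, here is a minimal runnable sketch of the whole flow; the detach().clone() snapshot is an addition of mine so the before/after comparison actually prints a non-zero difference:

import torch
import torch.nn as nn

conv3 = nn.Conv1d(64, 128, kernel_size=3)
ab = conv3.weight.detach().clone()  # snapshot of the old values, not a reference

new_params = torch.ones(128, 2, 3)
with torch.no_grad():
    # weight has shape (128, 64, 3); the slice [:, 1:3] selects
    # input channels 1 and 2, matching new_params' shape (128, 2, 3)
    conv3.weight[:, 1:3].copy_(new_params)

cd = conv3.weight
print((cd - ab).abs().sum())  # non-zero: the weights really changed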