Change weights in the net

I have the following network:

import torch
from torch import nn

class Network(torch.nn.Module):
    def __init__(self, input_size, out_size):
        super(Network, self).__init__()
        self.input_size = input_size
        self.out_size = out_size
        self.lin = nn.Linear(self.input_size, self.out_size, bias=False)
        nn.init.zeros_(self.lin.weight)

net = Network(input_size, output_size)

I need to update the weights: net.lin.weight = net.lin.weight + delta

I get an error: TypeError: cannot assign 'torch.FloatTensor' as parameter 'weight' (torch.nn.Parameter or None expected)

How can I fix this?

As the error message suggests, a module attribute registered as a parameter must be an nn.Parameter, so wrap the new tensor:

with torch.no_grad():
    net.lin.weight = nn.Parameter(net.lin.weight + delta)
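Note that wrapping in nn.Parameter creates a new parameter object, so an optimizer constructed earlier would still hold a reference to the old tensor. An alternative is an in-place update under torch.no_grad(), which keeps the original Parameter. A minimal sketch, using a standalone nn.Linear with hypothetical sizes and a hypothetical delta:

```python
import torch
from torch import nn

# Standalone layer mirroring the question's setup (sizes are hypothetical).
lin = nn.Linear(4, 3, bias=False)
nn.init.zeros_(lin.weight)

# Hypothetical update; must match the weight's shape (out_size, input_size).
delta = torch.ones(3, 4)

with torch.no_grad():
    lin.weight += delta  # in-place add on the existing Parameter object

print(lin.weight.sum().item())  # 12.0
print(isinstance(lin.weight, nn.Parameter))  # True
```

Because the update runs under torch.no_grad() and modifies the tensor in place, autograd does not record it and the parameter's identity is preserved.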