This works but the weights and bias values are random. How can I assign values to them without training over samples?
x = Variable(torch.Tensor([1.0]), requires_grad=True)  # example input value
# creates a wx + b layer
f = nn.Linear(1,1)
y = f(x)
I have tried the following to assign values to `weight` and `bias`:
f.weight = 2.0
f.bias = 1.0
f.weight = torch.Tensor()
f.bias = torch.Tensor()
f.weight = nn.Parameter(torch.Tensor())
f.bias = nn.Parameter(torch.Tensor())
None of these seems to work.
If you want to operate directly on the tensors containing the parameters, you can access them using f.weight.data and f.bias.data.
I am not sure what your context is, but state_dict() and load_state_dict() are the recommended methods to save and load values for a model’s parameters.
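A minimal sketch of that approach, assuming the same `nn.Linear(1, 1)` layer from the question and example values of 2.0 for the weight and 1.0 for the bias:

```python
import torch
import torch.nn as nn

f = nn.Linear(1, 1)

# Build a state dict with the desired values. Shapes must match the
# layer's parameters: weight is (out_features, in_features), bias is
# (out_features,).
new_state = {
    "weight": torch.tensor([[2.0]]),
    "bias": torch.tensor([1.0]),
}
f.load_state_dict(new_state)

# The layer now computes 2.0 * x + 1.0
y = f(torch.tensor([3.0]))
print(y.item())  # 7.0
```

The values end up in the existing `weight` and `bias` parameters (they are copied in place), so autograd still tracks them during subsequent training.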
Does this still apply now that .data is deprecated alongside Variable after PyTorch 0.4?
As you’ve mentioned, the .data attribute and Variables are deprecated, so you could wrap the code in a with torch.no_grad() block and manipulate the parameters in place via e.g. copy_() or fill_().
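A short sketch of that pattern, again assuming the `nn.Linear(1, 1)` layer and the example values 2.0 and 1.0:

```python
import torch
import torch.nn as nn

f = nn.Linear(1, 1)

# Wrapping the in-place ops in no_grad() keeps them out of the
# autograd graph, so the assignment itself is not tracked.
with torch.no_grad():
    f.weight.copy_(torch.tensor([[2.0]]))  # in-place copy into the parameter
    f.bias.fill_(1.0)                      # in-place fill of the bias

y = f(torch.tensor([3.0]))
print(y.item())  # 7.0
```

Unlike rebinding `f.weight` to a new object, `copy_()` and `fill_()` modify the existing parameter tensors, so any optimizer already holding references to them keeps working.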