When I run this code:

import torch
from torch import nn
from torch.autograd import Variable

# Build a feed-forward network
model = nn.Sequential(nn.Linear(2352, 128),
                      nn.ReLU(),
                      nn.Linear(128, 64),
                      nn.ReLU(),
                      nn.Linear(64, 10),
                      nn.LogSoftmax(dim=1))
criterion = nn.CrossEntropyLoss()
dataiter = iter(trainloader)
images, labels = next(dataiter)
images = images.view(images.shape[0], -1)
# Forward pass, get our log-probabilities
logits = model(images)
logits = Variable(torch.randn(10, 120).float(), requires_grad = True)
labels = Variable(torch.FloatTensor(10).uniform_(0, 120).long())
# Calculate the loss with the logits and the labels
loss = criterion(logits, labels.squeeze())
then:
print('Before backward pass: \n', model[0].weight.grad)
loss.backward()
print('After backward pass: \n', model[0].weight.grad)
I get;
Before backward pass:
None
After backward pass:
None
Why am I getting None gradients even after calling backward()?
Could anyone help me, please?
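To check my understanding, here is a minimal sketch (the layer sizes and names are just illustrative, not my real model) of how I expected gradients to behave: a layer's .weight.grad should be populated only when the loss is computed from a tensor that the model actually produced.

```python
import torch
from torch import nn

# Tiny stand-in model (illustrative sizes, not my real network)
model = nn.Sequential(nn.Linear(4, 3))
criterion = nn.CrossEntropyLoss()

x = torch.randn(5, 4)
labels = torch.randint(0, 3, (5,))

# Loss built from the model's output: backward() fills weight.grad
loss = criterion(model(x), labels)
loss.backward()
print(model[0].weight.grad is None)  # False: gradients were populated

# Loss built from a fresh tensor that never passed through the model:
# the model's parameters are not part of this graph, so their .grad
# is untouched by backward()
model.zero_grad(set_to_none=True)
fresh = torch.randn(5, 3, requires_grad=True)
criterion(fresh, labels).backward()
print(model[0].weight.grad is None)  # True: model never entered the graph
```

Is this the right mental model, and does my snippet above fall into the second case?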