[SOLVED] Grad is None. Seems like network is not updated

I’m building a CNN with a custom loss function.

But it seems like the weights of the CNN are not being updated.

At every iteration, the CNN produces the same prediction.

When I try to inspect the grad values with

print(predicted_image.grad) # Shows None

for param in my_net.parameters():
    print("param", param)            # shows a tensor filled with floats
    print("param.grad", param.grad)  # shows None
    print(param.grad.data.sum())     # raises: 'NoneType' object has no attribute 'data'

I can’t see any grad values.

I can’t figure out what I did wrong, even after searching.

Did you check param.requires_grad?
And are you backpropagating the loss properly?
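For reference, the usual training step looks like this — a minimal sketch with a stand-in model and placeholder names, not your actual network:

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 1)  # stand-in for the CNN
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.MSELoss()

x = torch.randn(8, 4)
target = torch.randn(8, 1)

optimizer.zero_grad()                 # clear old gradients
prediction = model(x)                 # forward pass builds the autograd graph
loss = criterion(prediction, target)  # loss must stay connected to that graph
loss.backward()                       # populates param.grad for every parameter
optimizer.step()                      # updates the weights

for param in model.parameters():
    assert param.grad is not None     # grads exist after backward()
```

The key point is that the loss tensor must come straight out of differentiable operations on the model’s output; as long as it does, backward() will fill in param.grad.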

Hi, the check you suggested prints True:

for param in direct_intrinsic_net.parameters():
    print("param.requires_grad", param.requires_grad)  # True

I’m not sure if I performed the backpropagation correctly.

I did it like this:

# Initialize all gradients before I pass image data into network
# I also checked using zero_grad() after forward pass but there was no difference
optimizer.zero_grad()

# I obtained summed_loss, then I used the following
summed_loss = Variable(summed_loss, requires_grad=True)

# Check the loss value; it is constant at every iteration
print(summed_loss) # tensor([4.6291], device='cuda:0')

summed_loss.backward()

optimizer.step()
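For anyone landing here later: the Variable(summed_loss, requires_grad=True) line is almost certainly the problem. Wrapping an existing tensor that way creates a new leaf tensor with no history, so backward() has nothing upstream to propagate into and the network’s grads stay None. A small demonstration with toy tensors (not the original network; torch.tensor plays the role of the old Variable wrapper here):

```python
import torch

w = torch.randn(3, requires_grad=True)
loss = (w ** 2).sum()  # loss is connected to w through the graph

# Re-wrapping the loss makes a brand-new leaf tensor with no history.
detached = torch.tensor(loss.item(), requires_grad=True)
detached.backward()
print(w.grad)  # None: the gradient never reached w

# Backpropagating the original, still-connected loss works.
loss.backward()
print(w.grad)  # now populated with 2*w
```

So the fix is simply to call summed_loss.backward() directly, without re-wrapping it.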

I think I may not have implemented my custom loss function correctly by inheriting from the nn.Module or Function classes.
So I’m closing this question.
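For completeness: a custom loss usually doesn’t need a Function subclass at all. Subclassing nn.Module and composing differentiable torch operations keeps the autograd graph intact — a sketch with a made-up loss, not the poster’s actual one:

```python
import torch
import torch.nn as nn

class MyLoss(nn.Module):
    """Example custom loss built only from differentiable torch ops."""
    def forward(self, prediction, target):
        # Any composition of torch operations stays differentiable;
        # no Variable(...) re-wrapping is needed afterwards.
        return ((prediction - target) ** 2).mean()

model = nn.Linear(4, 2)
criterion = MyLoss()

pred = model(torch.randn(5, 4))
loss = criterion(pred, torch.randn(5, 2))
loss.backward()

print(model.weight.grad is not None)  # True: the graph was preserved
```

A custom autograd.Function (with explicit forward/backward) is only needed when the loss uses operations autograd can’t differentiate on its own.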