Thanks. I used the above code and it got an error in backward():
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
<ipython-input-22-88aafe38e0f1> in <module>()
15 loss =(loss * sample_weight / sample_weight.sum()).sum()
16 print (sample_weight.shape, loss.shape)
---> 17 loss.mean().backward()
18
19 #loss_total = torch.mean(loss * weights)
1 frames
/usr/local/lib/python3.6/dist-packages/torch/autograd/__init__.py in backward(tensors, grad_tensors, retain_graph, create_graph, grad_variables)
91 Variable._execution_engine.run_backward(
92 tensors, grad_tensors, retain_graph, create_graph,
---> 93 allow_unreachable=True) # allow_unreachable flag
94
95
RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
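For reference, here is a minimal sketch of the same weighting pattern that runs backward() cleanly; the model, inputs, and criterion are placeholders, not the actual code above. The key point is that the per-sample loss must still be attached to the autograd graph (i.e. it has a grad_fn) before the weighting is applied.

```python
import torch
import torch.nn as nn

# Placeholder setup -- replace with your own model, data, and loss.
model = nn.Linear(10, 3)
inputs = torch.randn(4, 10)
targets = torch.randint(0, 3, (4,))
sample_weight = torch.tensor([1.0, 2.0, 0.5, 1.0])

# reduction='none' keeps one loss value per sample so it can be weighted.
criterion = nn.CrossEntropyLoss(reduction='none')

logits = model(inputs)              # has grad_fn because the model's parameters require grad
loss = criterion(logits, targets)   # shape (4,), still attached to the graph
loss = (loss * sample_weight / sample_weight.sum()).sum()

print(loss.requires_grad, loss.grad_fn)  # should print True and a non-None grad_fn
loss.backward()
```

The RuntimeError above ("element 0 of tensors does not require grad and does not have a grad_fn") usually means the tensor the loss was built from is detached from the graph, for example because it was created from a numpy array, taken via .data or .detach(), or computed inside a torch.no_grad() block. Printing loss.requires_grad and loss.grad_fn right before backward() should show at which step the connection to the graph is lost.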