No gradient for making the max value 1 and others 0

Thanks @albanD. Does this mean that `b = (x == x.max(dim=1, keepdim=True)[0]).type(torch.FloatTensor)` is actually not differentiable, and that is why `x.grad` is `None`?

And is there a way to set the maximum value in a tensor to 1 and the others to 0 while still letting gradients flow through the tensor? I found a similar question here: Set Max value to 1, others to 0 - #2 by KFrank, but it seems one-hot encoding is also not differentiable.