Is torch.max(x,y) differentiable?

Given that x and y are one-hot vector matrices obtained from Gumbel-Softmax, and z = torch.max(x, y), is z still differentiable?
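
For context, here is a minimal sketch of the setup being asked about; the logit tensor names and shapes are assumptions, not part of the original question:

```python
import torch
import torch.nn.functional as F

# Two batches of (approximately) one-hot rows sampled with the
# straight-through Gumbel-Softmax (hard=True gives one-hot samples).
logits_a = torch.randn(4, 10, requires_grad=True)
logits_b = torch.randn(4, 10, requires_grad=True)

x = F.gumbel_softmax(logits_a, tau=1.0, hard=True)
y = F.gumbel_softmax(logits_b, tau=1.0, hard=True)

z = torch.max(x, y)  # element-wise maximum of the two one-hot matrices
```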

Hi,

Assuming that x and y are floating point tensors that require gradients, then yes, the element-wise max is differentiable. For each output element, the gradient flows back to whichever input held the larger value.
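
Here is a minimal sketch that checks this routing behavior (the values are arbitrary, chosen to avoid ties):

```python
import torch

x = torch.tensor([1.0, 5.0, 3.0], requires_grad=True)
y = torch.tensor([4.0, 2.0, 0.5], requires_grad=True)

z = torch.max(x, y)   # element-wise max: tensor([4., 5., 3.])
z.sum().backward()

print(x.grad)  # tensor([0., 1., 1.]) -- x was larger at indices 1 and 2
print(y.grad)  # tensor([1., 0., 0.]) -- y was larger at index 0
```

Like relu, max is only piecewise differentiable: there is a kink where x == y, but autograd still returns a valid subgradient there.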
