This is correct. Bools are discrete, so it doesn't make sense to try to
differentiate them, and autograd doesn't try.
(More specifically, the tensor `0.2 < tseq` itself has dtype `torch.bool`, and
PyTorch has already set its `requires_grad` property to `False`, even
though it was computed from a tensor that had `requires_grad = True`.
So the problem has already occurred before the `logical_and()` call.)
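For illustration, here is a minimal sketch (with `tseq` as a stand-in float tensor, since the original tensor isn't shown) that shows the comparison already produces a `torch.bool` tensor with `requires_grad = False`, before any `logical_and()` is applied:

```python
import torch

# stand-in for the original tensor that had requires_grad = True
tseq = torch.linspace(0.0, 1.0, 5, requires_grad=True)

mask = 0.2 < tseq                # comparison returns a boolean tensor
print(mask.dtype)                # torch.bool
print(mask.requires_grad)        # False -- autograd does not track bool tensors

both = torch.logical_and(mask, tseq < 0.8)
print(both.requires_grad)        # still False -- the graph was already cut above
```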