[Resolved] Missing gradients due to silly typo

I have two tensors of 3 features each, each linked to its own rank value.

I’m attempting to predict and learn, in a pairwise fashion, whether I have a lower- or higher-rank match. However, my problem seems to be that I cannot compute my gradients.
(The model here is simplified for debugging purposes.)
I was hoping someone could give me some pointers.
Example:
f1 = [1,3,5], f2 = [1,4,5], y1 = 1, y2 = 5
If f1 scores lower than f2 and the rank y1 is lower than the rank y2, then we have a positive match; otherwise we don’t.
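In code terms, the rule I’m after is roughly this (a minimal sketch of the labeling rule only; score_1 and score_2 stand in for the model’s scores of f1 and f2):

score_1, score_2 = 0.2, 0.7                        # hypothetical model scores for f1 and f2
y1, y2 = 1, 5                                      # ranks from the example above
positive_match = (score_1 < score_2) == (y1 < y2)  # True here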
PyTorch version: 1.7.3

import random
import torch

w = torch.randn(3, 1, requires_grad=True)
softmax = torch.nn.Softmax(dim=0)

def model(x, w):
    # simple linear scoring model (simplified for debugging)
    z = torch.zeros(1, dtype=torch.float, requires_grad=True)
    z = z + x @ w
    return z

def custom_loss(p, t):
    #Cross Entropy
    total = torch.zeros(1, dtype=torch.float, requires_grad=True)
    left_comp = (p) * torch.log(t)
    right_comp = (1-p) * torch.log(1-t)
    total = total + -1 * (left + right)
    return total

current_sample_ys = [int(5 * random.random()) for i in range(2)]
current_sample_xs = [[random.random() for i in range(3)] for i in range(2)]

labels = softmax(torch.tensor(current_sample_ys, dtype=torch.float))
features = [ torch.tensor(list(x), dtype=torch.float) for x in current_sample_xs]

m1 = model(features[0], w)
m2 = model(features[1], w)

outputs = softmax(torch.cat((m1, m2), dim=0))
# Turning -1:1 range to 0:1
pred_outputs = (torch.sub(*outputs) + 1) / 2
target = (torch.sub(*labels) + 1) /2

r = custom_loss(target, pred_outputs)

r.backward()

print(w.grad) #None

Hi Phil!

As written, I believe that neither left nor right is defined in your function
custom_loss() (nor elsewhere in the code you posted), so I don’t think
that your code will run at all.

If you fix that (in a sensible way), gradients should backpropagate through
custom_loss().
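For example, a corrected version could look like this (a minimal sketch, assuming you meant to use left_comp and right_comp; the intermediate requires_grad=True zeros tensor isn’t needed, since gradients flow back to w through the computation itself):

def custom_loss(p, t):
    # binary cross entropy between target p and prediction t
    left_comp = p * torch.log(t)
    right_comp = (1 - p) * torch.log(1 - t)
    return -1 * (left_comp + right_comp)

With that change, r.backward() should populate w.grad.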

Best.

K. Frank

This is embarrassing. I’ve been going crazy over this. Thank you, Frank!
Any recommendations for a Jupyter notebook linter or something of the sort that would have helped me catch this?