I think I got lost now. Rewriting the function once more (in a probably more readable way):
X = Variable(torch.Tensor([[0.6946, 0.1328],
[0.6563, 0.6873],
[0.8184, 0.8047],
[0.8177, 0.4517],
[0.1673, 0.2775],
[0.6919, 0.0439],
[0.4659, 0.3032],
[0.3481, 0.1996]]))
y = Variable(torch.Tensor([1.0, 3.0, 2.0, 2.0, 3.0, 1.0, 2.0, 3.0]))
def customized_loss(X, y):
    def similarity_matrix(mat):
        a = mat.size()
        a = a[0]
        simMatrix = Variable(torch.zeros(a, a), requires_grad=True)
        for i in xrange(a):
            for j in xrange(a):
                simMatrix[i][j] = torch.norm(mat[i] - mat[j])
        return simMatrix
    def convert_y(y):
        a = y.size()
        a = a[0]
        converted_y = Variable(torch.zeros(a, a), requires_grad=True)
        for i in xrange(a):
            for j in xrange(a):
                if y[i] == y[j]:
                    converted_y[i, j] = 1
        return converted_y
    X_similarity = similarity_matrix(X)
    association = convert_y(y)
    loss_num = torch.sum(torch.mul(X_similarity, association))
    loss_all = torch.sum(X_similarity)
    loss_denum = loss_all - loss_num
    loss = loss_num / loss_denum
    return loss

loss = customized_loss(X, y)
As far as I can see, everything is now done in Variables, from beginning to end. We pass X and y (which are Variables) to the function, and then everything is done with Variables. The only other Variables I need to define are simMatrix in the similarity_matrix function (and likewise converted_y), and there I get this error:
RuntimeError: in-place operations can be only used on variables that don't share storage with any other variables, but detected that there are 2 objects sharing it.
Of course, the same thing happens in the convert_y function when I create the converted_y Variable.
And I have no clue what is going wrong, and googling this error doesn't turn up any results.
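If it helps, I think the failing pattern reduces to this minimal sketch: writing a single element into a freshly created Variable that requires grad. (I'm guessing here that the in-place write into a leaf is what autograd objects to; the exact error text may differ across versions.)

```python
import torch
from torch.autograd import Variable

# A Variable created with requires_grad=True is a leaf of the graph.
sim = Variable(torch.zeros(3, 3), requires_grad=True)

try:
    sim[0, 0] = 1.0   # element-wise write = in-place op on a leaf
except RuntimeError as err:
    print("in-place write rejected:", err)
```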
…
You have already spent some time here, so thanks for that, but if you could guide me on how to fix this problem (or write it out, if it is a quick fix), that would be awesome. From the PyTorch tutorial about Variables it is not clear to me what I am doing wrong (I have never used Torch before). I guess the problem is that I am implicitly creating a new Variable in the middle of the graph, but is there any way around it?
-
So, the problem is: if I define simMatrix as a Variable, we get this problem with shared storage; if we don't define it as a Variable (which wouldn't make much sense anyway, since we want its gradients in the backprop), then we instead get the error 'can't assign a Variable to a scalar value of type float', which makes perfect sense.
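One workaround I am considering (just a sketch, not tested against my full setup) is to never assign into simMatrix at all, and instead build the whole matrix in one out-of-place broadcasting expression, so autograd has nothing to complain about. Here X is given requires_grad=True purely for the demonstration:

```python
import torch
from torch.autograd import Variable

X = Variable(torch.Tensor([[0.6946, 0.1328],
                           [0.6563, 0.6873],
                           [0.8184, 0.8047]]), requires_grad=True)

# diff[i, j] = X[i] - X[j]; reducing over the last dim gives all
# pairwise L2 distances in one out-of-place op.
diff = X.unsqueeze(1) - X.unsqueeze(0)                 # shape (n, n, 2)
simMatrix = (diff.pow(2).sum(dim=2) + 1e-12).sqrt()    # shape (n, n)
# (the small epsilon keeps sqrt's gradient finite on the zero diagonal)

simMatrix.sum().backward()   # no in-place error; X.grad gets populated
```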
-
The other problem is that it seems I cannot compare y[i] with y[j] in the convert_y function. Because they are Variables, they cannot be compared directly, and if I use y[i].data instead (which likely causes problems during back-prop), strangely enough it raises a RuntimeError saying 'bool value of non-empty torch.ByteTensor objects is ambiguous'.
Is there a solution around this?
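Similarly, a possible way around the y[i] == y[j] comparison (again just a sketch) might be to do all the comparisons at once with broadcasting and cast the resulting mask to float, so no per-element Python if is needed and nothing is written in-place. No gradient is needed through the labels themselves:

```python
import torch
from torch.autograd import Variable

y = Variable(torch.Tensor([1.0, 3.0, 2.0, 2.0, 3.0, 1.0, 2.0, 3.0]))

# One broadcasted elementwise comparison instead of a double loop;
# association[i, j] == 1.0 exactly when y[i] == y[j].
association = (y.unsqueeze(0) == y.unsqueeze(1)).float()
```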