Hi,

I am searching for ways to compute the loss on slices. The code below is a working example, but it uses a fixed slice (positions 1:30). What I need requires a variable length, one for each element of my dataset.

To give some context: I have a convnet, but one dimension of the samples in my dataset has variable size. I am not sure what to do, but for now I am padding to equalize the length of my data. I think it would be easier to slice `y_pred` and `y` after the conv layers to calculate the loss.

Any ideas on how I could do this? Is there a better way?

I can’t transform the data.

```
import torch
from torch.autograd import Variable

x = Variable(torch.randn(160, 21, 60, 1))
y = Variable(torch.LongTensor(160, 60, 1).random_(0, 4), requires_grad=False)
model = Model()  # my convnet (definition omitted)
criterion = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9)

for t in range(500):
    y_pred = model(x)
    # Fixed slice: 30 positions starting at index 1 along the length dimension.
    loss = criterion(y_pred.narrow(2, 1, 30), y.narrow(1, 1, 30))
    print(t, loss.data[0])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```
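To make the question concrete, here is a minimal sketch of two ways a per-sample variable length could be handled, assuming the true (unpadded) length of each sample is known. The shapes, the `lengths` tensor, and the choice to slice from position 0 are all made up for illustration; they are not from my actual model. Option 1 slices each sample with `narrow()` in a loop; option 2 keeps the padded tensors and lets `CrossEntropyLoss` skip the padding via `ignore_index`.

```python
import torch

# Illustrative shapes: 8 padded samples, 4 classes, max sequence length 60.
# lengths[i] is the assumed-known true length of sample i.
batch, num_classes, max_len = 8, 4, 60
y_pred = torch.randn(batch, num_classes, max_len)    # model output (logits)
y = torch.randint(0, num_classes, (batch, max_len))  # padded targets
lengths = torch.randint(10, max_len + 1, (batch,))

# Option 1: slice each sample to its own length with narrow().
# Sum the per-position losses and divide by the total number of valid
# positions, so the result is one mean over all valid positions.
total_loss, total_count = 0.0, 0
for i in range(batch):
    n = int(lengths[i])
    logits_i = y_pred[i].narrow(1, 0, n).t()  # (n, num_classes)
    target_i = y[i].narrow(0, 0, n)           # (n,)
    total_loss = total_loss + torch.nn.functional.cross_entropy(
        logits_i, target_i, reduction="sum")
    total_count += n
loop_loss = total_loss / total_count

# Option 2: stay fully vectorized. Overwrite padded target positions with
# ignore_index (-100 is the default), so they contribute nothing to the loss.
pad_mask = torch.arange(max_len).unsqueeze(0) >= lengths.unsqueeze(1)
y_ignored = y.masked_fill(pad_mask, -100)
masked_loss = torch.nn.CrossEntropyLoss(ignore_index=-100)(y_pred, y_ignored)
```

Both options should give the same value; option 2 avoids the Python loop, which matters for larger batches.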