How to get tensor slices?

If I have a tensor like this
tensor([[[0.3843, 0.3552]],
        [[0.1743, 0.2439]],
        [[0.3474, 0.1554]],
        [[0.1325, 0.0778]],
        [[0.2280, 0.1096]],
        [[0.3842, 0.3997]],
        [[0.1599, 0.0283]],
        [[0.1876, 0.0815]]])
How could I get a tensor like this from it:
tensor([[[0.3843]],
        [[0.1743]],
        [[0.3474]],
        [[0.1325]],
        [[0.2280]],
        [[0.3842]],
        [[0.1599]],
        [[0.1876]]])

You can directly index tensors via brackets, just as in numpy:

import torch

x = torch.tensor([[[0.3843, 0.3552]],
                  [[0.1743, 0.2439]],
                  [[0.3474, 0.1554]],
                  [[0.1325, 0.0778]],
                  [[0.2280, 0.1096]],
                  [[0.3842, 0.3997]],
                  [[0.1599, 0.0283]],
                  [[0.1876, 0.0815]]])

print(x[:, :, 0])
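
Note that indexing with a single integer drops that dimension, so x[:, :, 0] has shape [8, 1]. If you want to keep the trailing dimension, as in your desired output, you can slice instead of index:

print(x[:, :, 0].shape)    # torch.Size([8, 1])    - indexing drops the dim
print(x[:, :, 0:1].shape)  # torch.Size([8, 1, 1]) - slicing keeps it
print(x[..., :1])          # equivalent, using an ellipsis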

Is this code correct, assuming that inputs contains all the training instances, i.e. there is only one batch in trainloader?

for epoch in range(n_epochs):

    for i, (inputs, labels) in enumerate(trainloader):
        inputs = inputs.to(device)
        labels = labels.to(device)

        # Forward + backward + optimize
        outputs = net(inputs.float())

        if epoch == 1:
            optimizer = torch.optim.Adam(net.parameters())

        optimizer.zero_grad()
        
        loss = criterion(outputs, labels.float())
        loss.backward()
        optimizer.step()

I would create the optimizer before entering the training loop.
Besides that, it looks alright.
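
For example, something like this (a sketch, assuming net, criterion, trainloader, n_epochs, and device are defined as in your snippet). Note that recreating Adam inside the loop would also reset its internal state (the running averages), so creating it once is preferable:

optimizer = torch.optim.Adam(net.parameters())  # create once, before training

for epoch in range(n_epochs):
    for i, (inputs, labels) in enumerate(trainloader):
        inputs = inputs.to(device)
        labels = labels.to(device)

        optimizer.zero_grad()
        outputs = net(inputs.float())
        loss = criterion(outputs, labels.float())
        loss.backward()
        optimizer.step()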

I’m not sure which criterion you are using, but I assume it expects float targets.

I can’t create the optimizer before the loop, because the net is built during the forward pass; if I try, I get this error: ValueError: optimizer got an empty parameter list.

This error is raised if you didn’t register the parameters properly.
Could you post the model code, please?
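
For reference, a common cause of an empty parameter list is storing submodules in a plain Python container instead of an nn.ModuleList, so they are never registered. A minimal illustrative sketch (not your actual model):

import torch.nn as nn

class BrokenNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Plain Python list: these layers are NOT registered as submodules
        self.layers = [nn.Linear(2, 4), nn.Linear(4, 1)]

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

print(list(BrokenNet().parameters()))  # [] -> "optimizer got an empty parameter list"

class FixedNet(nn.Module):
    def __init__(self):
        super().__init__()
        # nn.ModuleList registers the layers, so parameters() finds them
        self.layers = nn.ModuleList([nn.Linear(2, 4), nn.Linear(4, 1)])

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

print(len(list(FixedNet().parameters())))  # 4 (weight + bias per layer)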