Hi,
I think torch.tensor — PyTorch 1.7.0 documentation and torch.as_tensor — PyTorch 1.7.0 documentation have explained the difference clearly, but in summary: torch.tensor
always copies the data, while torch.as_tensor
tries to avoid that! In both cases, they don't accept a sequence of tensors.
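A minimal sketch of the difference (assuming a NumPy source array, where torch.as_tensor can share memory):

```python
import numpy as np
import torch

a = np.array([1, 2, 3])

t_copy = torch.tensor(a)     # always copies the data
t_view = torch.as_tensor(a)  # shares memory with `a` when possible

a[0] = 99
print(t_copy[0])  # tensor(1)  -> unaffected, it owns its own copy
print(t_view[0])  # tensor(99) -> reflects the change, memory is shared

# And neither accepts a sequence of tensors:
# ts = [torch.randn(2), torch.randn(2)]
# torch.tensor(ts)  # raises an error instead of building a stacked tensor
```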
The more intuitive way is to stack them along a given dimension, which you can find here: How to turn a list of tensor to tensor? - PyTorch Forums
The problem with your approach is that converting the tensors to numpy detaches them from the computational graph, so you lose the grads, while stacking preserves them.
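A quick sketch of why stacking is the safer route (the exact grad_fn name may vary between versions):

```python
import torch

xs = [torch.randn(3, requires_grad=True) for _ in range(4)]

stacked = torch.stack(xs, dim=0)  # shape (4, 3), still on the graph
print(stacked.grad_fn)            # e.g. <StackBackward ...>, graph is intact

stacked.sum().backward()
print(xs[0].grad)                 # gradients flow back to each original tensor

# Going through numpy would require .detach() first, which severs the graph,
# so no gradients would ever reach the tensors in xs.
```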
Actually, you have posted a similar answer to that issue too! How to turn a list of tensor to tensor? - #10 by Brando_Miranda