Slice notation and training


(David Liebman) #1

Does using slice notation help or harm PyTorch’s ability to train a model? If I use a lot of slice notation on my torch.Variables inside my model, am I shooting myself in the foot? I was under the impression that in Python, taking a slice of something creates a whole new copy. Is that the case in PyTorch, and does it mess up training?
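
For example, my mental model from plain Python lists is that a slice is a brand-new object:

```python
a = [1, 2, 3, 4]
b = a[1:3]   # b is a new list, copied from a
b[0] = 99
print(a)     # [1, 2, 3, 4] -- the original is untouched
```

I don’t know whether tensors behave the same way.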


#2

It shouldn’t mess anything up. Are you seeing any issues when you use slicing?
If so, could you provide some examples or code?

Also, since you mentioned Variables: Variables have been deprecated since PyTorch 0.4.0, so you can just use plain tensors now. :wink:
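
To illustrate, here is a minimal sketch (the tensor names are just for illustration): a slice of a tensor is a view that shares storage with the original, and autograd records the slicing op, so gradients flow back through it:

```python
import torch

# A tensor slice is a view, not a copy: it shares storage.
x = torch.arange(6.0)
v = x[2:5]
v[0] = 100.0
print(x)  # tensor([  0.,   1., 100.,   3.,   4.,   5.])

# Autograd tracks the slice, so backprop works through it.
w = torch.randn(4, requires_grad=True)
loss = (w[1:3] ** 2).sum()
loss.backward()
print(w.grad)  # 2 * w at indices 1 and 2, zeros elsewhere
```

So heavy use of slicing won’t break training by itself, although creating many tiny slices can add some overhead.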


(David Liebman) #3

I have no specific issues. I’m working on a Module that isn’t training as well as I’d like, so I’m speculating about possible reasons. Thanks for your quick reply.


#4

Maybe there are other bugs in the code?
If possible, you could create a new thread with your code so that we can have a look. :slight_smile:


(David Liebman) #5

I actually posted about my project here:

Since that post I have changed the code somewhat, but all in all my results are the same: quite often my output is just ‘I’ on its own. If you could take a look at the post, I would be most grateful.