Tensors as arguments

Hello, I am new to PyTorch, so sorry for any naiveté or ignorance. I have been experimenting with replicating the functionality of the distributions in tensorflow.contrib. While working on a sample generator for the random normal, I first erroneously passed a torch.Size object to randn, and after converting it to a tensor I still received an error. So the arguments supplied to randn need to be integers, then.

I notice that a lot of TensorFlow arguments are tensors. Say I store, or derive through a function call, the shape of my normal-distribution parameters as plain integers. When things get more complex, is there a performance hit from having to derive and pass in integers instead of working with tensors as arguments?
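To make the question concrete, here is a minimal sketch of what I mean (the names mu and sigma are just for illustration):

```python
import torch

mu = torch.zeros(10, 20)     # mean parameters of the distribution
sigma = torch.ones(10, 20)   # std-dev parameters

# Derive the shape from a parameter tensor, then unpack it into
# plain ints before handing it to randn
samples = mu + sigma * torch.randn(*mu.size())
print(samples.size())
```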

That might be a really terrible question, so I am sorry if so :smile:

Passing in a few arguments is unlikely to have any performance impact :slight_smile:

x = torch.randn(10, 20)
y = torch.randn(*x.size()) # unpack the torch.Size into plain ints; should work
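As a follow-up sketch (assuming a reasonably recent PyTorch), torch.randn_like avoids unpacking the size at all, since it takes the shape, dtype, and device directly from an existing tensor:

```python
import torch

x = torch.randn(10, 20)
y = torch.randn(*x.size())  # unpack torch.Size into plain ints
z = torch.randn_like(x)     # shorthand: same shape and dtype as x

print(y.size(), z.size())
```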