Hi all,

Is there a way to re-assign the batch size of a tensor? For example, if I have a tensor `x` of shape `[1,200,1,1]`, is there a way like `x.shape[0] = 20` so it becomes of shape `[20,200,1,1]`?

Any help/suggestion would be appreciated.


Hi,

What do you expect this new Tensor to contain? When you say random, do you mean truly random or undefined values?

The simplest would be to just re-generate it with `x = torch.randn(new_size)`.
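A minimal sketch of that suggestion, assuming the values can simply be redrawn at the desired batch size:

```python
import torch

# Re-generate the tensor directly at the target batch size,
# instead of trying to mutate the shape of the existing one.
x = torch.randn(20, 200, 1, 1)
print(x.shape)  # torch.Size([20, 200, 1, 1])
```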

@albanD I understand you, but no: this random tensor is not generated directly with `torch.randn`. It comes after a number of calculations on another tensor that was generated with `torch.randn`.

But if you want 20 times more samples (as in the example you gave initially), you need to call your code that generates these numbers with the bigger input to get more random numbers.

Or do you want 20 times the same samples?
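The first option (re-running the generating code with a bigger input) might look like the sketch below, where `f` is a hypothetical stand-in for the chain of calculations that produces `x` from the initial `torch.randn` tensor:

```python
import torch

def f(z):
    # Hypothetical placeholder for the real computation that
    # turns a [B, 200] random tensor into a [B, 200, 1, 1] result.
    return (z * 2 + 1).unsqueeze(-1).unsqueeze(-1)

# Feeding a batch of 20 inputs yields 20 genuinely distinct samples.
x = f(torch.randn(20, 200))
print(x.shape)  # torch.Size([20, 200, 1, 1])
```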

Hoooo, then `x.expand([20,200,1,1])` should do it.
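As a sketch of the `expand` route: it returns a view that broadcasts the size-1 batch dimension, so all 20 "samples" share the same underlying data and no copy is made (use `.repeat(20, 1, 1, 1)` instead if you need independent, writable copies):

```python
import torch

x = torch.randn(1, 200, 1, 1)

# expand only works on dimensions of size 1; it creates a view
# with stride 0 along the expanded dimension (no new memory).
y = x.expand(20, 200, 1, 1)
print(y.shape)  # torch.Size([20, 200, 1, 1])
```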