Hi everyone.

I am implementing the local reparameterization trick (https://papers.nips.cc/paper/5666-variational-dropout-and-the-local-reparameterization-trick.pdf) and realized that I need a matrix that repeats the same parameter vector row-wise. Suppose a layer with 512 neurons.

If I code this:

```
bias = nn.Parameter(torch.zeros(512))
bias_matrix = bias.repeat(batch, 1)  # shape (batch, 512), one row per batch element
```

If I now sample from this bias matrix, does PyTorch (when performing backward) know that each row is the same parameter?

What I want to do is avoid this:

```
bias = nn.Parameter(torch.zeros(512))
samples = []
for i in range(batch):
    # one sample of the (same) bias per batch element -- slow Python loop
    samples.append(bias + torch.randn(512))
```
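To convince myself, I tried a minimal sketch (the additive Gaussian noise here is just a stand-in for the actual sampling step, not the full local reparameterization): since `repeat()` is a differentiable op, the gradients from every row should accumulate into the single underlying parameter vector.

```python
import torch
import torch.nn as nn

batch = 4

# single shared bias vector for a layer with 512 neurons
bias = nn.Parameter(torch.zeros(512))

# repeat() is tracked by autograd, so each of the `batch` rows
# is a view-like copy of the same underlying parameter
bias_matrix = bias.repeat(batch, 1)  # shape (batch, 512)

# stand-in for sampling: add independent noise per row
noise = torch.randn(batch, 512)
out = (bias_matrix + noise).sum()
out.backward()

# every row contributes a gradient of 1 per element, so the
# accumulated gradient on the shared parameter equals `batch`
print(bias.grad.shape)  # torch.Size([512])
print(bias.grad[0])     # tensor(4.)
```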