Issues about "batch_size" and "normalize"

  • How does PyTorch implement batch training? For example, my training data consists of one-hot vectors and batch_size = 5, so five such vectors are fed into the neural network at a time. Does the network read these five samples one by one during training, or all at once?

  • I want to apply 2-norm (L2) normalization along the last dimension, i.e. normalize each row of the tensor.

    Python code:
    """Normalization."""
    x = x / x.norm(dim=-1)[:, None]

The code above shows my problem: the sizes of x and x.norm(dim=-1) do not match, and the result is "RuntimeError: inconsistent tensor size".

Thanks for reading, and I hope for your advice.
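
For reference, a minimal sketch of the shapes involved (written for a recent PyTorch release, where plain tensors and broadcasting are available, so details may differ on 0.2-era builds; the sizes are just an example):

import torch

x = torch.randn(5, 7)          # batch_size x feature_dim
row_norms = x.norm(dim=-1)     # shape (5,): one 2-norm per row

# Dividing a (5, 7) tensor by a (5,) tensor does not line up; the norms
# must first be reshaped to a column, (5, 1), so each row is divided by
# its own norm.
x_unit = x / row_norms.view(-1, 1)

print(x_unit.norm(dim=-1))     # every entry is ~1.0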

  • They are all read in at once, i.e.
    input one-hot tensor: batch_size x seq_len
    output embedding: batch_size x seq_len x embedding_dim
    The whole batch is computed in a single forward pass (see the sketch after this list).
  • What is the shape of the input x? The code looks fine to me; it may be clearer to use:
    x = x / x.norm(dim=-1).view(-1, 1).expand_as(x)
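
To illustrate the first answer, a minimal sketch of how a whole batch is processed in one call (written for a recent PyTorch; the vocabulary and layer sizes are made up, and nn.Embedding takes token indices rather than explicit one-hot vectors):

import torch
import torch.nn as nn

batch_size, seq_len, vocab_size, embedding_dim = 5, 3, 10, 8

# Indices of the "hot" positions, shape: batch_size x seq_len.
tokens = torch.randint(0, vocab_size, (batch_size, seq_len))

embed = nn.Embedding(vocab_size, embedding_dim)

# A single call processes the entire batch, not one sample at a time.
out = embed(tokens)
print(out.shape)    # torch.Size([5, 3, 8]) = batch_size x seq_len x embedding_dim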

Thanks!
x is a [torch.FloatTensor of size 4x7], i.e. batch_size = 4 and each one-hot vector has length 7.
It is wrapped as x = Variable(x).
I then want to 2-norm-normalize x during the forward pass.
What should I do?
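
A minimal sketch of one way to do this inside a module's forward pass (the module and layer here are hypothetical, written for a recent PyTorch; on 0.2-era versions the input would still need wrapping in Variable). The expand_as form is the one reported to work later in this thread:

import torch
import torch.nn as nn

class NormalizedClassifier(nn.Module):
    # Hypothetical module: L2-normalizes each input row, then applies a linear layer.
    def __init__(self, in_features, out_features):
        super(NormalizedClassifier, self).__init__()
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x):
        # x: batch_size x in_features
        # Turn the per-row norms into a column and expand to x's shape,
        # so the division is plain element-wise (no broadcasting needed).
        norms = x.norm(dim=-1).view(-1, 1).expand_as(x)
        return self.linear(x / norms)

model = NormalizedClassifier(7, 4)
out = model(torch.randn(4, 7))   # batch_size = 4, one-hot length 7
print(out.shape)                 # torch.Size([4, 4])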

Actually it works fine on my server:

In [1]: import torch as t

In [2]: t.__version__
Out[2]: '0.2.0_3'

In [3]: x = t.randn(4, 7)

In [4]: x = t.autograd.Variable(x)

In [5]: x = x / x.norm(dim=-1)[:, None]

In [6]: x
Out[6]: 
Variable containing:
-0.2147  0.1313 -0.1385  0.0362  0.0239 -0.7362 -0.6112
 0.3956  0.0211  0.7707  0.3180  0.1106 -0.2142 -0.2998
-0.0607  0.1202 -0.6570  0.4154 -0.6012  0.0698  0.1061
-0.3823 -0.1427 -0.3490  0.0242 -0.2743  0.1059 -0.7903
[torch.FloatTensor of size 4x7]

Thanks! My platform is PyTorch on Windows.
In my environment:

x = x / x.norm(dim=-1)[:, None]

raises: RuntimeError: inconsistent tensor size at d:\downloads\pytorch-master-1\torch\lib\th\generic/THTensorMath.c:874

But

x = x / x.norm(dim=-1).view(-1, 1).expand_as(x)

runs successfully!

Thank you again for your helpful suggestions!
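
As a side note, newer PyTorch releases provide the same per-row normalization as a built-in helper; a minimal sketch, assuming torch.nn.functional.normalize is available in your version (it may not exist in the 0.2-era Windows build discussed above):

import torch
import torch.nn.functional as F

x = torch.randn(4, 7)

# Built-in L2 normalization along the last dimension; equivalent to
# dividing each row by its 2-norm (with a small eps to avoid dividing by zero).
x_unit = F.normalize(x, p=2, dim=-1)

print(x_unit.norm(dim=-1))   # every entry is ~1.0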