Cannot pass custom weights to F.instance_norm

Hi,

When I use F.instance_norm with batch_size = 1 everything runs fine.
But with a higher batch size I get size errors.

Each element of my batch is a person, so I want to give weights that normalize per person and per channel, e.g. a batch of 3 persons, 64 channels and whatever 2D size => weights of size (3, 64).

If I understand it right, instance norm is perfect for that, but I cannot pass anything other than 64 elements to F.instance_norm(weight=…).

I tried (3, 64) and 192 (3*64), but it only accepts 64 elements (which is wrong for my case because I want parameters per channel per batch element).
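
For reference, this is roughly what I'm trying (the 32x32 spatial size is just a placeholder):

import torch
import torch.nn.functional as F

x = torch.randn(3, 64, 32, 32)      # 3 persons, 64 channels, arbitrary 2D size
w = torch.randn(3, 64)              # desired per-person, per-channel weights

out = F.instance_norm(x, weight=w)  # errors: F.instance_norm only accepts a weight of shape (64,)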

Does anyone know how to do it?

Thanks in advance,
Pierre

Instance norm does not use per-batch-element weights; that doesn't make sense as a network layer, since a layer's learned parameters can't depend on which samples happen to be in the batch. You can just do the affine transform yourself after instance norm…
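
Concretely, that could look something like this (a minimal sketch; the shapes follow the (3, 64) example above, and the weights are reshaped so they broadcast over the spatial dimensions):

import torch
import torch.nn.functional as F

x = torch.randn(3, 64, 32, 32)   # (N, C, H, W): 3 persons, 64 channels
w = torch.randn(3, 64)           # per-person, per-channel scale
b = torch.randn(3, 64)           # per-person, per-channel shift

out = F.instance_norm(x)                                 # plain instance norm, no affine
out = out * w.view(3, 64, 1, 1) + b.view(3, 64, 1, 1)    # your own per-sample affine transform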

Thanks for the answer,

I had misunderstood some concepts, but after reading again it's clearer :slight_smile:

Your solution works great; I'm just doing the "adaptive" part on my own after a plain instance norm.

Thanks again


For those who might have the same question:

I'm just doing a standard PyTorch instance norm and then multiplying by my own weights (and adding my own bias) to get the adaptive behaviour:

norm = nn.InstanceNorm2d(num_channels, affine=False)   # plain instance norm, no learned affine
out = norm(x)                                           # x: (N, C, H, W)
weighted_out = out * w_custom + b_custom                # w_custom, b_custom reshaped to (N, C, 1, 1) so they broadcast
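
Wrapped as a small module it could look something like this (just a sketch; the name AdaptiveInstanceNorm2d is made up, and the per-person weights are assumed to be computed elsewhere and passed in at call time):

import torch
import torch.nn as nn

class AdaptiveInstanceNorm2d(nn.Module):
    # instance norm without learned affine, followed by a caller-supplied per-sample affine
    def __init__(self, num_channels):
        super().__init__()
        self.norm = nn.InstanceNorm2d(num_channels, affine=False)

    def forward(self, x, w_custom, b_custom):
        # x: (N, C, H, W); w_custom, b_custom: (N, C)
        out = self.norm(x)
        return out * w_custom[:, :, None, None] + b_custom[:, :, None, None]

layer = AdaptiveInstanceNorm2d(64)
y = layer(torch.randn(3, 64, 8, 8), torch.randn(3, 64), torch.randn(3, 64))   # y: (3, 64, 8, 8)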