Different weight initialization for every element of the batch

Customized weight initialization can be achieved with model.apply(initialize_weight), where initialize_weight is the function that implements the initialization (note that the function itself is passed, not its return value).
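For context, here is a minimal sketch of that pattern, assuming initialize_weight is a plain function that checks the module type (the layer sizes are arbitrary):

```python
import torch.nn as nn

# model.apply() calls this on every submodule, passing the module itself.
def initialize_weight(m):
    if isinstance(m, nn.Conv2d):
        nn.init.kaiming_normal_(m.weight)
        if m.bias is not None:
            nn.init.zeros_(m.bias)

model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Conv2d(16, 32, 3))
model.apply(initialize_weight)  # pass the function, not initialize_weight()
```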
However, on the first batch the kernels applied to the different batch elements will be identical.
What if we want to initialize different kernel values for each element within the batch?
Conv kernels in PyTorch have shape (out_channels, in_channels, kernel_height, kernel_width). Since batch_size does not appear anywhere in the kernel shape, we can confirm that the same kernels are applied to every element of the batch.
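This is easy to verify by inspecting the weight tensor of a Conv2d layer (the channel and kernel sizes below are just an example):

```python
import torch.nn as nn

conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=5)
print(conv.weight.shape)  # torch.Size([16, 3, 5, 5]) = (out_channels, in_channels, kH, kW)
# There is no batch dimension in the kernel, so the same weights
# are applied to every element of the input batch.
```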
One way to bypass this is to set batch_size to 1 (a sketch of that workaround is below). Is there another way to achieve this for batch_size > 1?
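For reference, here is a rough sketch of the batch_size=1 workaround I had in mind: re-initialize the kernels before each single-sample forward pass, so every sample effectively sees different weights. The init function and model are hypothetical placeholders.

```python
import torch
import torch.nn as nn

def initialize_weight(m):
    if isinstance(m, nn.Conv2d):
        nn.init.normal_(m.weight)

model = nn.Sequential(nn.Conv2d(3, 8, 3))
batch = torch.randn(4, 3, 32, 32)               # pretend batch of 4 images

outputs = []
for sample in batch:                            # iterate over batch elements
    model.apply(initialize_weight)              # fresh kernels for this sample only
    outputs.append(model(sample.unsqueeze(0)))  # forward pass with batch_size = 1
out = torch.cat(outputs, dim=0)
```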