Why isn't data type an argument when creating nn modules?

I was looking at the code of LayerNorm, and its tensors are implicitly assigned the float32 data type.
Why has this been done?
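
For example, a quick check (a minimal sketch, assuming a standard PyTorch install) shows the default:

import torch
import torch.nn as nn

layer = nn.LayerNorm(10)
print(layer.weight.dtype)  # torch.float32 -- the default parameter dtype
print(layer.bias.dtype)    # torch.float32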

Hi,

All of a module’s parameters are created with the default tensor type, which is torch.FloatTensor (float32) unless you change it.
If you want a different type, you can either change the default tensor type with torch.set_default_tensor_type() before creating the module, or convert the layer afterwards, e.g. your_layer.double() to convert to double, or your_layer.to(new_type).
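
A small sketch of all three approaches (the layer and dtypes here are just illustrative):

import torch
import torch.nn as nn

# Option 1: change the default tensor type before creating the module
torch.set_default_tensor_type(torch.DoubleTensor)
layer = nn.LayerNorm(10)
print(layer.weight.dtype)  # torch.float64

# Option 2: convert an existing layer in place
torch.set_default_tensor_type(torch.FloatTensor)  # restore the default
layer = nn.LayerNorm(10)
layer.double()  # converts all parameters to float64
print(layer.weight.dtype)  # torch.float64

# Option 3: use .to() with a target dtype
layer = nn.LayerNorm(10).to(torch.float16)
print(layer.weight.dtype)  # torch.float16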


Thanks, didn’t know about that.