Hi,
I wrote a custom linear layer that supports complex-valued data, as follows:
Custom layer:

```python
import torch
import torch.nn as nn

def apply_complex(fr, fi, input, dtype=torch.complex128):
    # (W_r + i W_i)(x_r + i x_i) = (W_r x_r - W_i x_i) + i (W_r x_i + W_i x_r)
    return (fr(input.real) - fi(input.imag)).type(dtype) \
        + 1j * (fr(input.imag) + fi(input.real)).type(dtype)

class ComplexLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super(ComplexLinear, self).__init__()
        self.fc_r = nn.Linear(in_features, out_features, dtype=torch.float64)
        self.fc_i = nn.Linear(in_features, out_features, dtype=torch.float64)

    def forward(self, input):
        return apply_complex(self.fc_r, self.fc_i, input)
```
However, at some point I want to access the weights of the custom linear layer (call it `head`) with `wm = head.weight.data`, which works for a plain `nn.Linear`. Is the equivalent here simply `wm = head.fc_r.weight.data + 1j * head.fc_i.weight.data`, or should I define a weight parameter in the custom layer that does the same?
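For what it's worth, here is a minimal sketch of the check I have in mind. It assumes the `ComplexLinear` definition above; the names `head`, `wm`, and `bm` are just illustrative. One subtlety: since both `fc_r` and `fc_i` carry their own real bias, the combined complex layer ends up with an effective bias of `(b_r - b_i) + 1j*(b_r + b_i)`, not just the combined weight.

```python
import torch
import torch.nn as nn

def apply_complex(fr, fi, input, dtype=torch.complex128):
    return (fr(input.real) - fi(input.imag)).type(dtype) \
        + 1j * (fr(input.imag) + fi(input.real)).type(dtype)

class ComplexLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super(ComplexLinear, self).__init__()
        self.fc_r = nn.Linear(in_features, out_features, dtype=torch.float64)
        self.fc_i = nn.Linear(in_features, out_features, dtype=torch.float64)

    def forward(self, input):
        return apply_complex(self.fc_r, self.fc_i, input)

head = ComplexLinear(4, 3)                      # illustrative sizes
x = torch.randn(5, 4, dtype=torch.complex128)   # random complex batch

# Combined complex weight, as proposed in the question
wm = head.fc_r.weight.data + 1j * head.fc_i.weight.data
# Effective complex bias: both branches add their real biases, so
# real part gets (b_r - b_i) and imaginary part gets (b_r + b_i)
bm = (head.fc_r.bias.data - head.fc_i.bias.data) \
    + 1j * (head.fc_r.bias.data + head.fc_i.bias.data)

ref = head(x)
manual = x @ wm.T + bm
print(torch.allclose(ref, manual))  # True
```

So the proposed `wm` does recover the complex weight matrix of the layer, but reproducing the full forward pass also needs the combined bias.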