Hello everyone.
This is a follow-up question concerning this. The issue is that in the ResNet model I'm dealing with, I can't simply replace PReLU with ReLU, as it drastically hurts network performance.
So my question is: what are my options here? What should I be doing in this case?
Would doing something like this suffice?
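Something along these lines; the names and channel sizes below are illustrative stand-ins for my actual block, with the custom module taking over the learned slope of the existing nn.PReLU:

import torch.nn as nn
import torch.nn.functional as F

class PReLU_Quantized(nn.Module):
    # Stand-in for the custom replacement under test (name and body are
    # placeholders); it shares the learned slope of the reference nn.PReLU.
    def __init__(self, prelu):
        super().__init__()
        self.weight = prelu.weight

    def forward(self, x):
        return F.prelu(x, self.weight)

class Block(nn.Module):
    # Wiring that matches the forward pass below (channel count assumed).
    def __init__(self, channels=64):
        super().__init__()
        self.bn0 = nn.BatchNorm2d(channels)
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.prelu = nn.PReLU()                    # reference activation
        self.prelu2 = PReLU_Quantized(self.prelu)  # candidate replacement
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)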
The diff between the reference activation and the replacement is calculated like this in the forward pass:
def forward(self, x):
    residual = x  # saved for the skip connection
    out = self.bn0(x)
    out = self.conv1(out)
    out = self.bn1(out)
    out1 = self.prelu(out)   # reference nn.PReLU
    out2 = self.prelu2(out)  # candidate replacement, applied to the same input
    # apples-to-apples check: this should print ~0 if the replacement is correct
    print(f'diff : {(out1 - out2).mean().item()}')
    out = self.conv2(out1)
This is the normal implementation, which I used on the ordinary model (i.e. not quantized!) to check that it produces the correct result before moving on to the quantized version:
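It was roughly this shape (an illustrative reconstruction; note that torch.max(x) and torch.min(x) here reduce the whole tensor, which, as explained below, is exactly the mistake):

import torch
import torch.nn as nn

class PReLU_Float(nn.Module):
    # Broken variant, shown for reference: torch.max(x) / torch.min(x)
    # collapse the tensor to scalars instead of comparing elementwise with 0.
    def __init__(self, prelu):
        super().__init__()
        self.weight = prelu.weight

    def forward(self, x):
        # WRONG: tensor-wide reductions, not max(0, x) and min(0, x)
        return torch.max(x) + self.weight * torch.min(x)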
OK, I figured it out! I made a huge mistake at the very beginning. I needed to calculate
PReLU(x) = max(0, x) + a * min(0, x)

or, equivalently,

PReLU(x) = x if x >= 0, and a * x otherwise,

where max and min are taken elementwise against zero, and not the actual min or max of the tensor (i.e. torch.min / torch.max reductions), which doesn't make sense!
Now, can anyone do me a favor and tell me how I can vectorize this? I'm kind of lost at the moment!
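What I have in mind is something like the following, though I'm not sure it's the right way to do it (a sketch; the class name is mine, and it assumes the default single shared slope, i.e. num_parameters=1):

import torch
import torch.nn as nn

class PReLU_Vectorized(nn.Module):
    # max(0, x) is relu(x) and min(0, x) is -relu(-x), all elementwise,
    # so no tensor-wide reductions are needed.
    def __init__(self, prelu):
        super().__init__()
        self.weight = prelu.weight  # assumes num_parameters=1

    def forward(self, x):
        return torch.relu(x) - self.weight * torch.relu(-x)

# quick check against the built-in:
p = nn.PReLU()
pv = PReLU_Vectorized(p)
x = torch.randn(2, 64, 8, 8)
print((p(x) - pv(x)).abs().max().item())  # should print 0.0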