So I was experimenting with my NN when, while changing F.leaky_relu(x, 0.2) to F.relu(x), I accidentally changed it to F.relu(x, 0.2). The training progressed fine, but is there any significance to the 0.2?
The 0.2 in F.relu will be treated as the inplace argument, which will be interpreted as inplace=True.
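A quick sketch to illustrate: F.relu's signature is relu(input, inplace=False), so a second positional argument lands in the inplace slot, and any truthy value (including 0.2) switches on in-place mode. The tensor values below are assumptions for demonstration:

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-1.0, 0.5])

# In leaky_relu, the second positional argument is negative_slope:
y = F.leaky_relu(x, 0.2)   # negative input scaled by 0.2 -> [-0.2, 0.5]

# In relu, the second positional argument is `inplace`; 0.2 is truthy,
# so this behaves like F.relu(x, inplace=True) and overwrites x itself:
z = F.relu(x, 0.2)
print(z is x)              # True: the same tensor object is returned
print(x)                   # negative entry clamped to 0.0 in place
```

So the 0.2 has no effect on the activation's shape, which is why training looked normal; the only change is that the ReLU was applied in place (which can break autograd if the input is needed for a backward pass elsewhere).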