Does relu have any parameters? My understanding is that the relu function (relu(x) = max(0, x)) just returns x when x is positive and 0 otherwise, so it has no learnable parameters involved. But in this official PyTorch tutorial
Deep Learning with PyTorch: A 60 Minute Blitz
when one prints the parameters of the model
params = list(net.parameters())
print(len(params))
the output seems to include parameters from the relu function as well. Can anyone explain?
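For reference, here is a minimal sketch I tried (assuming PyTorch is installed; the small Sequential model below is my own toy example, not the tutorial's net) to see which modules actually contribute parameters:

```python
import torch.nn as nn

# nn.ReLU by itself registers no learnable parameters.
relu = nn.ReLU()
print(len(list(relu.parameters())))  # prints 0

# Toy model: parameters come only from Conv2d and Linear layers.
net = nn.Sequential(
    nn.Conv2d(1, 6, 5),   # index 0: has weight + bias
    nn.ReLU(),            # index 1: parameter-free
    nn.Linear(10, 2),     # index 2: has weight + bias
)

# Names show the owning submodule index; nothing starts with "1."
for name, p in net.named_parameters():
    print(name, tuple(p.shape))
```

In this toy model, named_parameters() only lists entries for indices 0 and 2, which suggests the ReLU layer itself contributes nothing.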