Relu parameters

Does ReLU have any parameters? My understanding is that the ReLU function (relu(x) = max(0, x)) simply clamps negative values to zero and involves no learnable parameters. But in this official PyTorch tutorial
Deep Learning with PyTorch: A 60 Minute Blitz
when one prints the number of parameters of the model

params = list(net.parameters())
print(len(params))

it looks as if the ReLU functions also contribute parameters. Can anyone explain?

They shouldn’t be in there: ReLU has no learnable parameters. In the tutorial you’ll get a list of 10 parameters, which are two parameters (weight and bias) for each of the five layers.
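A quick sketch to verify this yourself: an `nn.ReLU` module holds no parameters, while a layer like `nn.Linear` contributes a weight and a bias (the layer sizes below are arbitrary, just for illustration):

```python
import torch.nn as nn

# ReLU is stateless: its parameter list is empty.
relu = nn.ReLU()
print(len(list(relu.parameters())))    # 0

# A Linear layer contributes two parameters: weight and bias.
linear = nn.Linear(4, 3)
print(len(list(linear.parameters())))  # 2
```

So a model with five conv/linear layers reports 10 parameter tensors regardless of how many ReLUs it contains.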

Thank you, it is very helpful @ptrblck