Graph in Docs for LeakyReLU wrong

The graph used in the docs available here
https://pytorch.org/docs/stable/generated/torch.nn.LeakyReLU.html

[screenshot of the plot from the LeakyReLU docs page]

is the graph of the regular ReLU function. The slope of the graph for x<0 should be non-zero. Is there somewhere to report this so it can be fixed?

I don’t think the plot shows the plain ReLU; it shows a LeakyReLU using the default negative_slope value of 1e-2.
If you zoom in a bit, you will see the tiny slope:
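Besides zooming in, you can also confirm it numerically. This is a small sketch (not from the docs): with the default negative_slope of 0.01, negative inputs are scaled by 0.01 rather than clamped to zero as plain ReLU would do:

```python
import torch

act = torch.nn.LeakyReLU()  # default negative_slope=0.01
x = torch.tensor([-4.0, -1.0, 0.0, 1.0])
out = act(x)
# Negative inputs are multiplied by 0.01 instead of being set to 0
print(out)  # tensor([-0.0400, -0.0100,  0.0000,  1.0000])
```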

To reproduce the figure, run:

import torch
import matplotlib.pyplot as plt

act = torch.nn.LeakyReLU()  # default negative_slope=0.01
x = torch.arange(-7, 7, 0.01)
out = act(x)

f = plt.figure()
plt.xlim([-7, 7])
plt.ylim([-7, 7])
plt.grid()
plt.plot(x, out)
plt.show()

which will create:
[plot of LeakyReLU with the default negative_slope, matching the docs figure]

To see the slope more clearly, increase the default value:

act = torch.nn.LeakyReLU(negative_slope=0.5)
x = torch.arange(-7, 7, 0.01)
out = act(x)

f = plt.figure()
plt.xlim([-7, 7])
plt.ylim([-7, 7])
plt.grid()
plt.plot(x, out)
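The same activation is also available in functional form, which avoids constructing a module just to evaluate a few points (a small sketch, not part of the original reply):

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-2.0, 2.0])
# negative_slope=0.5 makes the kink at 0 easy to spot
print(F.leaky_relu(x, negative_slope=0.5))  # tensor([-1.,  2.])
```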

Wow, OK, my mistake! I didn’t look carefully enough, given how small the slope is.
Thanks a lot, Peter.
