Layer's output is inconsistent with the next layer's input value

Steps to reproduce:

  1. Load an AlexNet from torchvision
  2. Change the model's first weight to torch.tensor(float('inf'))
  3. Register forward hooks and check the output of the first conv layer and the first ReLU layer (see the repro sketch after these notes)
    I speculate this is caused by how torch handles inf values
    Screenshots:
    Conv layer's output and another parameter:

    ReLU layer's output and another parameter:

    Torchinfo’s summary of AlexNet

    Note: input_val was a tuple
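
Here is a minimal sketch of these reproduction steps, assuming a recent torchvision AlexNet; the specific weight element set to inf and the hook bookkeeping are illustrative, not taken from the original screenshots:

```python
import torch
import torchvision

# weights=None needs torchvision >= 0.13; older versions use pretrained=False
model = torchvision.models.alexnet(weights=None).eval()

# Step 2: corrupt the first conv weight (a single element here, as an illustration)
with torch.no_grad():
    model.features[0].weight[0, 0, 0, 0] = float("inf")

captured = {}

def make_hook(name):
    def hook(module, input_val, output):
        # input_val is a tuple of the layer's positional inputs
        captured[name] = (input_val, output)
    return hook

model.features[0].register_forward_hook(make_hook("conv1"))  # first Conv2d
model.features[1].register_forward_hook(make_hook("relu1"))  # first ReLU

model(torch.randn(1, 3, 224, 224))

print(captured["conv1"][1].min())     # conv output as captured after the forward pass
print(captured["relu1"][0][0].min())  # ReLU input (first element of the input tuple)
```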

PS: Here's what happened when I changed the first weight to 999999


OK, this doesn't seem to be caused by the inf value

So I’m really confused

Update: I find that no layer's input data contains values less than 0.
It's as if the conv's output was already passed through ReLU before reaching the ReLU layer.

AlexNet uses inplace nn.ReLU layers as seen here, which would explain your observation.
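
To illustrate with a small standalone demonstration (my own sketch, not part of the original answer): nn.ReLU(inplace=True) overwrites the tensor it receives, so a hook that stored the conv layer's output tensor will later see the ReLU'd values, because it holds the very same tensor object:

```python
import torch
import torch.nn as nn

conv_out = torch.tensor([-1.0, 2.0, -3.0])
saved_by_hook = conv_out               # what a forward hook on the conv layer would hold

nn.ReLU(inplace=True)(conv_out)        # the next layer modifies the tensor in place
print(saved_by_hook)                   # tensor([0., 2., 0.]) -- negative values are gone

conv_out = torch.tensor([-1.0, 2.0, -3.0])
out = nn.ReLU(inplace=False)(conv_out) # out-of-place ReLU returns a new tensor
print(conv_out)                        # tensor([-1., 2., -3.]) -- preserved
print(out)                             # tensor([0., 2., 0.])
```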


Woo, that means the output and the input of the ReLU layer will be the same if I hook it after the layer has completed its work.
You were so nice.
Thank you!
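
If you still want to record the true pre-ReLU conv output, one workaround (my own sketch, assuming the same torchvision AlexNet, not taken from this thread) is to clone the output inside the conv hook before the inplace ReLU can overwrite it:

```python
import torch
import torchvision

model = torchvision.models.alexnet(weights=None).eval()
captured = {}

def conv_hook(module, input_val, output):
    # copy the tensor now, before the following inplace ReLU mutates it
    captured["conv1_pre_relu"] = output.detach().clone()

model.features[0].register_forward_hook(conv_hook)
model(torch.randn(1, 3, 224, 224))

print(captured["conv1_pre_relu"].min())  # negative values are still visible here
```

Replacing the inplace ReLUs with nn.ReLU(inplace=False) instances would have the same effect, at the cost of extra memory.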