I am trying to reproduce this code snippet from PyTorch. When I pass the input to the model, it returns the following warnings. Will they affect my outputs?
Warning
[MAdd]: Dropout is not supported!
[Flops]: Dropout is not supported!
[Memory]: Dropout is not supported!
[MAdd]: LayerNorm is not supported!
[Flops]: LayerNorm is not supported!
[Memory]: LayerNorm is not supported!
I haven’t seen these warnings before. Could you post a complete, minimal, and executable code snippet as well as the entire output log, including the stack trace, please?
import torch
from torch import nn
from torchinfo import summary  # imported but not called in this snippet

decoder_layer = torch.nn.TransformerDecoderLayer(d_model=128, nhead=8, dim_feedforward=256)
decoder = torch.nn.TransformerDecoder(decoder_layer, num_layers=1)
src = torch.rand(1, 128)
tgt = torch.rand(1, 128)
out = decoder(tgt, src)
This code produces a tensor of shape [1, 128] along with the following messages:
[MAdd]: Dropout is not supported!
[Flops]: Dropout is not supported!
[Memory]: Dropout is not supported!
[MAdd]: LayerNorm is not supported!
[Flops]: LayerNorm is not supported!
[Memory]: LayerNorm is not supported!
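For what it's worth, these warnings typically mean the profiling tool simply skips Dropout and LayerNorm when counting MAdd/FLOPs/memory; they should not change the forward pass itself. A quick sanity check in plain PyTorch (no profiler, same layer sizes as above) confirms the output shape and that the flagged layer types are ordinary submodules of the decoder:

```python
import torch

decoder_layer = torch.nn.TransformerDecoderLayer(d_model=128, nhead=8, dim_feedforward=256)
decoder = torch.nn.TransformerDecoder(decoder_layer, num_layers=1)

# Unbatched inputs of shape (seq_len, d_model); seq_len = 1 here
src = torch.rand(1, 128)
tgt = torch.rand(1, 128)

out = decoder(tgt, src)
print(out.shape)  # torch.Size([1, 128]) — same shape as tgt

# The layer types the profiler complains about are regular submodules:
types = {type(m).__name__ for m in decoder.modules()}
print('Dropout' in types, 'LayerNorm' in types)  # True True
```

So the output values are computed normally; the warnings only affect the reported statistics, which will undercount the contribution of those layers.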
In that case I would recommend updating to the latest release of torchinfo and checking whether these warnings are still raised. If so, you might want to create an issue in their GitHub repository.