Perceiver pytorch shape issue

RuntimeError: Given normalized_shape=[29], expected input with shape [*, 29], but got input of size [64, 672, 250]

code:
model_conv = Perceiver(
    input_channels = 3,
    input_axis = 2,
    num_freq_bands = 6,
    max_freq = 10.,
    depth = 6,
    num_latents = 256,
    latent_dim = 512,
    cross_heads = 1,
    latent_heads = 8,
    cross_dim_head = 64,
    latent_dim_head = 64,
    num_classes = 1000,
    attn_dropout = 0.,
    ff_dropout = 0.,
    weight_tie_layers = False,
    fourier_encode_data = True,
    self_per_cross_attn = 2
)

#num_ftrs = model_conv.fc.in_features
model_conv.to_logits[2] = nn.Linear(4096, 2)

model_conv = model_conv.to(device)
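For context, assuming the lucidrains perceiver-pytorch implementation, the 29 in the traceback can be derived from this config: it is the per-point feature size after Fourier position encoding, which is what the cross-attention's norm layer is built for.

```python
# Sketch (assumption: lucidrains perceiver-pytorch Fourier encoding scheme).
input_channels = 3
input_axis = 2
num_freq_bands = 6

# Each axis contributes 2 * num_freq_bands (sin/cos pairs) + 1 (raw position).
fourier_channels = input_axis * (2 * num_freq_bands + 1)
context_dim = input_channels + fourier_channels
print(context_dim)  # 29 -- matches normalized_shape=[29] in the error
```

With input_axis = 2 and input_channels = 3, the model therefore expects channels-last image data of shape (batch, height, width, 3), not a (64, 672, 250) tensor.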

I guess the error is raised by an nn.LayerNorm layer, which received an input with an unexpected shape: normalized_shape=[29] means the input's last dimension must be 29, but your activation's last dimension is 250. Check where this norm layer is used and make sure the activation shape fits the expected shape.
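As a quick illustration of the constraint (not the Perceiver internals): nn.LayerNorm normalizes over the trailing dimension(s) given by normalized_shape, so a LayerNorm(29) only accepts inputs whose last dimension is 29.

```python
import torch
import torch.nn as nn

norm = nn.LayerNorm(29)

# Last dimension matches normalized_shape -> works; output keeps the input shape.
ok = torch.randn(64, 672, 29)
print(norm(ok).shape)  # torch.Size([64, 672, 29])

# Last dimension is 250 -> raises the same RuntimeError as in your traceback.
bad = torch.randn(64, 672, 250)
try:
    norm(bad)
except RuntimeError as e:
    print(e)
```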
