nn.ReLU, but the dimension is not normal

I have batch x 4096 x 6 (time samples) data in my network.
I want to apply nn.ReLU so that the data dimension is reduced to
batch x 512 x 6 (time samples),
but the nn.ReLU layer takes the last dimension, if I understand correctly.

How can I apply a ReLU over the wanted dimension?

The nn.ReLU layer won’t apply any reduction to the data; it applies the ReLU activation elementwise, as seen here:

import torch
import torch.nn as nn

batch_size = 2
x = torch.randn(batch_size, 4096, 6)
relu = nn.ReLU()

# ReLU is applied elementwise, so the shape is unchanged
out = relu(x)
print(out.shape)
> torch.Size([2, 4096, 6])

Depending on what dim1 represents, you could reduce it via e.g. nn.Conv1d layers, assuming it’s used as the channel dimension, as in the sketch below.
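
Here is a minimal sketch of that idea, assuming the 4096 dimension holds channels; the kernel_size=1 convolution is just an example and any channel-reducing layer would work:

import torch
import torch.nn as nn

batch_size = 2
x = torch.randn(batch_size, 4096, 6)  # [batch, channels, time]

# a kernel_size=1 conv reduces the channel dimension 4096 -> 512
# while leaving the 6 time samples untouched
conv = nn.Conv1d(in_channels=4096, out_channels=512, kernel_size=1)
relu = nn.ReLU()

out = relu(conv(x))
print(out.shape)
> torch.Size([2, 512, 6])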