Pytorch equivalent of activation=Linear from Keras

What is the equivalent of `activation="linear"` from Keras in PyTorch? Thanks

x_out = Conv2DTranspose(output_channels, kernel_size=3, strides=2, padding="same", activation="linear", kernel_initializer="glorot_normal")(x)

It seems that `activation="linear"` means forwarding without an activation function, i.e. the identity. So in PyTorch you simply don't add any activation after the layer.
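A minimal sketch of the PyTorch counterpart, assuming hypothetical channel counts (`in_channels=16`, `output_channels=8`): `nn.ConvTranspose2d` with `padding=1, output_padding=1` reproduces Keras' `padding="same"` behavior for kernel 3 / stride 2 (output spatial size = input size × stride), `nn.init.xavier_normal_` matches `kernel_initializer="glorot_normal"`, and no activation module is added since `activation="linear"` is the identity:

```python
import torch
import torch.nn as nn

in_channels, output_channels = 16, 8  # hypothetical values for illustration

# Transposed conv matching Conv2DTranspose(kernel_size=3, strides=2, padding="same"):
# output size = (H - 1) * stride - 2 * padding + kernel_size + output_padding = 2 * H
deconv = nn.ConvTranspose2d(in_channels, output_channels,
                            kernel_size=3, stride=2,
                            padding=1, output_padding=1)

# Glorot normal in Keras == Xavier normal in PyTorch
nn.init.xavier_normal_(deconv.weight)
nn.init.zeros_(deconv.bias)

x = torch.randn(1, in_channels, 10, 10)
y = deconv(x)   # no activation afterwards: activation="linear" is the identity
print(y.shape)  # torch.Size([1, 8, 20, 20])
```

If you build the model with `nn.Sequential`, just omit any `nn.ReLU()` / `nn.Sigmoid()` after this layer.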

Hey Eta_C, thanks for confirming.