How to add 3 Fully Connected layers to a ResNet50 pretrained on Imagenet?

Let me explain myself. I have to train a pretrained (on ImageNet) ResNet50 network, but instead of having one last FC layer of 1000 I need to add 3 more FC layers (1000->128, 128->64 and 64->10). I do this because I want to train on a custom dataset (10 classes of my own making). My code is this:

from torchvision import models
import torch.nn as nn

resnet50 = models.resnet50(pretrained=True)
# Replace the original 1000-way classifier head with a stack of FC layers
resnet50.fc = nn.Sequential(
    nn.Linear(2048, 1000),
    nn.Linear(1000, 128),
    nn.Linear(128, 64),
    nn.Linear(64, 10)
)

But I don't know if I have to use a ReLU() after the last FC layer (nn.Linear(64, 10)).

Thanks for your help.

Successive linear layers may not be very effective: (i) a composition of linear layers is still just a single linear operation, and (ii) it adds a lot of parameters. Generally we add differentiable non-linear activations between linear layers (sigmoid, tanh, ReLU, which is differentiable almost everywhere, etc.).
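As a minimal sketch of what that looks like with your layer sizes (ReLU between the hidden FC layers, nothing after the last one):

from torchvision import models
import torch.nn as nn

resnet50 = models.resnet50(pretrained=True)
resnet50.fc = nn.Sequential(
    nn.Linear(2048, 1000),
    nn.ReLU(),
    nn.Linear(1000, 128),
    nn.ReLU(),
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10)   # raw logits for the 10 classes, no activation here
)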

Regarding the last layer, since you are dealing with a classification problem, you want a distribution over classes as output. Generally the last layer outputs the logits, so no ReLU is applied at the end.
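This fits with nn.CrossEntropyLoss, which expects raw logits and applies log-softmax internally, so the head above can be trained as-is. A quick sketch (the batch and labels are dummy tensors, just for illustration):

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()      # applies log-softmax + NLL internally
inputs = torch.randn(4, 3, 224, 224)   # dummy batch of 4 images
targets = torch.randint(0, 10, (4,))   # dummy class indices in [0, 10)
logits = resnet50(inputs)              # shape (4, 10), raw logits
loss = criterion(logits, targets)
loss.backward()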