I usually use a dirty trick for this kind of thing. In your case, leave the resnet18 class unchanged (or change it so it stays compatible with the pretrained arg), and then do something like:
old_conv_weight = model.conv1.weight.data #get old weights
new_conv = nn.Conv2d(10, 64, kernel_size=7, stride=1, padding=3, bias=False) #create new conv layer
nn.init.xavier_normal_(new_conv.weight) #xavier init
new_conv.weight.data[:,:3].copy_(old_conv_weight) #copy old weights into first 3 channels
model.conv1 = new_conv #replace old conv with the new one