F.cross_entropy weight parameter does not seem to have an effect

I guess your target might contain the same class label for every sample in the batch, in which case the weighting is cancelled out during the normalization step.
This post shows how the weighting is implemented internally: the weighted loss is normalized by the sum of the per-sample weights in the last step. If all targets share a single class c, that sum is just N * weight[c], so the weight cancels and you get the plain (unweighted) mean loss back. Weighting only changes the result when the batch contains a mix of classes.
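Here is a minimal sketch illustrating the behavior (assuming the default reduction='mean'; the tensor shapes and weight values are just made up for the example):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

logits = torch.randn(8, 3)              # batch of 8 samples, 3 classes
weight = torch.tensor([1.0, 2.0, 5.0])  # arbitrary class weights

# Case 1: all targets belong to the same class -> weighting cancels out,
# because the loss is normalized by the sum of the per-sample weights
same_targets = torch.full((8,), 2, dtype=torch.long)
print(F.cross_entropy(logits, same_targets, weight=weight))
print(F.cross_entropy(logits, same_targets))  # same value as above

# Case 2: mixed targets -> the weighted loss differs from the plain loss
mixed_targets = torch.tensor([0, 1, 2, 0, 1, 2, 0, 1])
print(F.cross_entropy(logits, mixed_targets, weight=weight))
print(F.cross_entropy(logits, mixed_targets))  # different value
```

In the first case the weighted mean is weight[2] * sum(loss_i) / (8 * weight[2]), so weight[2] cancels and you get exactly the unweighted mean, which is why the weight argument appears to have no effect.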