This is because all of the ConstantPadNd modules call torch.nn.functional.pad under the hood without checking the length of the padding, and torch.nn.functional.pad has no information about the nn.Module's intended dimensionality. This might be somewhat in the "it's a feature, not a bug" category, since it isn't technically doing an "invalid" computation:
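For example, something like the following runs without complaint (the shapes and padding values here are arbitrary, just to illustrate the pass-through):

```python
import torch
import torch.nn as nn

x = torch.randn(1, 3, 8, 8)  # 4-D input

# ConstantPad1d accepts a 4-element padding tuple and simply forwards it
# to F.pad, which then pads the last *two* dimensions instead of one.
pad = nn.ConstantPad1d((1, 1, 2, 2), value=0.0)
y = pad(x)
print(y.shape)  # torch.Size([1, 3, 12, 10])
```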
Thank you for your reply. Maybe there is no dimension limit for torch.nn.functional.pad with mode='constant', while the other modes, like 'reflect' or 'replicate', are dimensionally constrained.
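A quick sketch of that difference (the exact error type and message may vary by PyTorch version):

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 3, 8, 8)

# mode='constant' accepts any even-length padding tuple up to 2 * x.dim(),
# here padding all four dimensions of a 4-D tensor.
y = F.pad(x, (1, 1, 2, 2, 3, 3, 4, 4), mode='constant', value=0)
print(y.shape)  # torch.Size([9, 9, 12, 10])

# mode='reflect' (and 'replicate') checks the padding against the input's
# dimensionality, so the same 8-element padding raises an error instead.
try:
    F.pad(x, (1, 1, 2, 2, 3, 3, 4, 4), mode='reflect')
except (RuntimeError, NotImplementedError) as e:
    print(e)
```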