Are there any differences between nn.activations and nn.functional.activations?

For example, the class torch.nn.ReLU and the function torch.nn.functional.relu seem to do the same thing. Can I use one of them in place of the other?

PyTorch activation modules call the functional API under the hood, so you can safely use one or the other. However, the nn.Module version additionally implements __repr__, which lets you easily pretty-print a summary of your network, and it can be registered as a submodule in containers like nn.Sequential.
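A quick sketch illustrating both points: the two forms produce identical outputs, and only the module form appears in a network's printed summary.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.tensor([-1.0, 0.0, 2.0])

# The module and the functional form compute the same result.
relu = nn.ReLU()
print(torch.equal(relu(x), F.relu(x)))  # True

# The module form registers as a submodule, so it shows up
# when you print the network.
model = nn.Sequential(nn.Linear(3, 3), nn.ReLU())
print(model)
```

The functional form is handy inside a custom forward() where you don't need a stateful layer; the module form is the natural choice when building networks declaratively.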