Dropout stochastically drops a subset of activations during training (i.e. a new random subset for every batch passing through the model), with the surviving activations rescaled so the expected output stays the same. At inference time dropout is switched off (although there is a whole theory about test-time dropout, e.g. Monte Carlo dropout for uncertainty estimation).
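A minimal sketch of this behaviour, assuming PyTorch (the specific tensor values are just illustrative):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

drop = nn.Dropout(p=0.5)   # drop each activation with probability 0.5
x = torch.ones(1, 8)

drop.train()               # training mode: random mask, survivors scaled by 1/(1-p)
print(drop(x))             # e.g. tensor([[2., 0., 2., 0., 2., 2., 0., 0.]])

drop.eval()                # eval mode: dropout is a no-op
print(drop(x))             # tensor([[1., 1., 1., 1., 1., 1., 1., 1.]])
```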
Pruning drops certain weights permanently: parts of the network deemed unimportant (for example, weights with small magnitude) are zeroed out or removed, and they stay removed at inference time.
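A minimal sketch, again assuming PyTorch and its `torch.nn.utils.prune` utilities (magnitude-based pruning is just one of several criteria):

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

lin = nn.Linear(8, 4)

# zero out the 50% of weights with the smallest absolute value
prune.l1_unstructured(lin, name="weight", amount=0.5)

# make the pruning permanent: fold the mask into the weight tensor
prune.remove(lin, "weight")

print((lin.weight == 0).float().mean())  # ~0.5 of the weights are now exactly zero
```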