What is dropout in convolutional layers and how does it differ from max-pooling dropout?

When dropout is applied to fully connected layers, some nodes are randomly set to 0.
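
To make my mental model concrete, here is roughly how I picture standard dropout on a fully connected layer (a minimal PyTorch sketch; the layer sizes and dropout rate are just placeholders):

```python
import torch
import torch.nn as nn

fc = nn.Linear(128, 64)
drop = nn.Dropout(p=0.5)   # each output activation is zeroed independently with probability 0.5

x = torch.randn(32, 128)            # a batch of 32 input vectors
h = drop(torch.relu(fc(x)))         # during training, roughly half of the 64 outputs are set to 0
```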

It is unclear to me how dropout works with convolutional layers. If dropout is applied before the convolution, are some elements of the input feature map set to zero?
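
In other words, is the convolutional case just something like the following sketch (again PyTorch, purely hypothetical), where individual elements of the input are zeroed before the convolution is applied? I have also seen a channel-wise variant (`Dropout2d`) that drops entire feature maps, which adds to my confusion:

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
drop = nn.Dropout(p=0.25)       # element-wise: zeros individual input values
drop2d = nn.Dropout2d(p=0.25)   # channel-wise variant: zeros whole feature maps

x = torch.randn(8, 3, 32, 32)
y = conv(drop(x))               # dropout applied to the input elements before the convolution
```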

If that is so, how does this differ from max-pooling dropout? In max-pooling dropout, some elements of the pooling layer's input are also randomly dropped (set to zero). So is the idea of max-pooling dropout the same as dropout in convolutional layers?
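
For reference, my understanding of max-pooling dropout is roughly this sketch (hypothetical; dropout is applied to the feature map before the max is taken in each pooling window):

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.25)
pool = nn.MaxPool2d(kernel_size=2)

feat = torch.randn(8, 16, 32, 32)   # output of a convolutional layer
pooled = pool(drop(feat))           # elements are zeroed before max-pooling, so a dropped
                                    # maximum lets a smaller value in the window win
```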