Is it common for some layers in a network to share weights with others?

For instance, is it common for a convolution layer to share weights with another conv layer?
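
To make the question concrete, here is a minimal PyTorch sketch of the kind of thing I mean (names and layer sizes are just illustrative): the same `nn.Conv2d` module applied at two different points in the forward pass, so both applications use the same underlying weights.

```python
import torch
import torch.nn as nn

class SharedConvNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.stem = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        # One conv module reused twice below, so its weights are shared
        # between the two "layers" of the forward pass.
        self.shared_conv = nn.Conv2d(16, 16, kernel_size=3, padding=1)

    def forward(self, x):
        x = torch.relu(self.stem(x))
        x = torch.relu(self.shared_conv(x))  # first application
        x = torch.relu(self.shared_conv(x))  # second application: same weights
        return x

net = SharedConvNet()
out = net(torch.randn(1, 3, 32, 32))
print(out.shape)
```

Is this kind of reuse something people actually do in practice, and if so, in what situations?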