Partial weight initialization in a network

I have a model with binary weight values (1s and 0s only).

  1. Is there any way to initialize the weights so that only the entries that are 1 get initialized, while the zeros stay at zero?

  2. To see how my sparse network performs, I would like to freeze all the zero-valued weights so that they stay at zero during backprop. Is there an efficient way to do that?
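To make the question concrete, here is a minimal sketch of the kind of thing I have in mind, assuming PyTorch (the layer sizes and the random binary mask are made up for illustration). It sets the weights from a 0/1 mask and uses a gradient hook to zero out the gradient at the masked positions, so the zeros are never updated:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

layer = nn.Linear(4, 3)

# Hypothetical binary mask: 1 = trainable weight, 0 = frozen at zero
mask = (torch.rand_like(layer.weight) > 0.5).float()

# Question 1: initialize the weights directly from the mask
# (ones where mask == 1, zeros elsewhere)
with torch.no_grad():
    layer.weight.copy_(mask)

# Question 2: zero the gradient at the masked positions during backprop,
# so those weights are never updated and remain zero
layer.weight.register_hook(lambda grad: grad * mask)

loss = layer(torch.randn(2, 4)).sum()
loss.backward()

# The frozen positions receive zero gradient
print(torch.all(layer.weight.grad[mask == 0] == 0))
```

Is something like this hook-based masking the recommended approach, or is there a more idiomatic/efficient way (e.g. a built-in pruning utility)? One caveat I'm aware of: with an optimizer that uses weight decay or momentum, a zero gradient alone might not keep the weights at exactly zero, so re-applying the mask after each optimizer step may also be needed.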