Pruning or zeroing certain weights

Hello, I have a network and I'd like to set a subset of its weights to 0. This should save computational time and memory. I've seen various questions about this on the message boards, but I'm not sure what the best answer is. Some suggest zeroing the weights after a propagation step; others suggest using a mask.
Thoughts?
Thanks, Matt
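For reference, the mask approach mentioned above can be sketched roughly like this (assuming PyTorch; the `MaskedLinear` class and its names are illustrative, not a library API). The idea is to keep a fixed 0/1 mask and multiply it into the weight inside `forward()`, so the masked connections stay at zero throughout training:

```python
import torch
import torch.nn as nn

class MaskedLinear(nn.Module):
    """Linear layer whose weight is elementwise-multiplied by a fixed 0/1 mask."""
    def __init__(self, in_features, out_features, mask):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # register_buffer: the mask follows .to(device) but is not a trainable parameter
        self.register_buffer("mask", mask)

    def forward(self, x):
        # zeroed entries of the weight never contribute to the output
        return nn.functional.linear(x, self.linear.weight * self.mask,
                                    self.linear.bias)

mask = (torch.rand(5, 8) > 0.5).float()  # illustrative random mask
layer = MaskedLinear(8, 5, mask)
y = layer(torch.randn(3, 8))
```

Note that this keeps the connections at zero but does not by itself save compute, since the full dense matrix multiply still runs.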

Can you give more specific details on how you want to choose the subset of weights? Randomly, or by some other criterion? I assume the purpose of this is to help avoid overfitting.

  • If you want to do it randomly, then why not use Dropout? That will also help with the overfitting problem.
  • In theory, you can also use L1 regularization, which results in sparse weights, so some of the network weights will become zero.

If the only reason for doing this is computational time, then why not reduce your model size?
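The L1 idea above can be sketched like this (assuming PyTorch; the model, data, and `l1_lambda` value are illustrative). You add an L1 penalty on the weights to the loss, which pushes many of them toward zero during training:

```python
import torch
import torch.nn as nn

# Hypothetical small model and random data, just to show the penalty term.
model = nn.Linear(100, 10)
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
l1_lambda = 1e-3  # strength of the L1 penalty; tune for your problem

x = torch.randn(32, 100)
y = torch.randn(32, 10)

optimizer.zero_grad()
loss = criterion(model(x), y)
# L1 penalty on the weights encourages sparsity (weights driven to zero)
l1_penalty = l1_lambda * model.weight.abs().sum()
(loss + l1_penalty).backward()
optimizer.step()
```

In practice plain SGD rarely drives weights *exactly* to zero, so people often threshold small weights afterwards if true sparsity is needed.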

Sure. I hope I can explain this clearly.
I have n vectors V that are all observations of the same physical state. Each vector V contains S samples (or features), so I have nS input samples in total.
Instead of processing all possible connections between the nS inputs and nS outputs (which would be (nS)^2 weights), I would like to process each vector V in the same way and extract k samples from each vector V. Then I would combine the nk samples.
To put numbers on it, let's say S = 1000 and n = 10.
So originally I have 1000 * 10 = 10,000 input elements. Instead of generating (10,000)^2 weights, I would like to process each 1000-element vector down to 10 points, then combine the 10 points from each processed vector.
One way to do this would be to set the weights connecting different vectors to zero.
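What you describe is a block-diagonal weight structure, and rather than zeroing the cross-vector blocks of one huge (nS) x (nk) matrix, you can get the same effect by applying one shared S -> k layer to each vector and then combining. A minimal sketch (assuming PyTorch; the combining stage and shapes are illustrative):

```python
import torch
import torch.nn as nn

n, S, k = 10, 1000, 10  # numbers from the post above

# Apply the SAME S -> k layer to each of the n vectors, then combine the
# n*k outputs. This is equivalent to a big (n*S) -> (n*k) layer whose
# cross-vector weight blocks are fixed at zero, but it never allocates
# or multiplies those zero blocks, so it actually saves time and memory.
per_vector = nn.Linear(S, k)   # shared across all n vectors
combine = nn.Linear(n * k, 1)  # illustrative combining stage

x = torch.randn(4, n, S)            # batch of 4 observations, each n vectors of S samples
h = per_vector(x)                   # (4, n, k): Linear acts on the last dimension
out = combine(h.reshape(4, n * k))  # combine the n*k extracted features
```

This uses S*k + n*k weights in the first two stages instead of (nS)^2, and because the per-vector layer is shared, each vector V really is "processed in the same way". If you want each vector to get its *own* S -> k weights instead of shared ones, n separate `nn.Linear(S, k)` layers (or a grouped/batched matmul) would do that, still without the zero blocks.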