Is column selection differentiable?

Hi all, I have a neural network (Net) with a trainable parameter matrix (shape 100 x dim) in its first layer. The input to the network is an index (idx) that selects one row of this matrix to feed to the subsequent layers. Currently I pass this index to the network manually on each call, so Net(idx) returns the idx-th row of the matrix as a (1 x dim) vector.
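For concreteness, here is a minimal sketch of that setup (the 16-dim embedding and the `rest` layer are placeholders, not my actual architecture):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    """Sketch: a trainable 100 x dim matrix; idx selects one row of it."""
    def __init__(self, num_rows=100, dim=16):   # dim=16 is a placeholder
        super().__init__()
        self.matrix = nn.Parameter(torch.randn(num_rows, dim))
        self.rest = nn.Linear(dim, 1)           # stand-in for the other layers

    def forward(self, idx):
        row = self.matrix[idx]   # row selection (an embedding-style lookup)
        return self.rest(row)

net = Net()
out = net(torch.tensor(3))   # idx set manually, as described above
out.backward()
```

After `backward()`, only the selected row of `net.matrix` receives a gradient, which is the behavior I currently rely on.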
Now I would like the value of idx to be computed instead of set manually. What I can think of is to add another neural net (e.g. net-2) with a sigmoid on top. The sigmoid produces a value in the range [0, 1], which I can then map to an integer in [0, 99] and feed to Net, i.e. Net(mapper(sigmoid_net-2)).
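Roughly like this (net-2's architecture and the rounding-based mapper are just placeholders for the idea):

```python
import torch
import torch.nn as nn

# Hypothetical net-2: any network ending in a sigmoid head
net2 = nn.Sequential(nn.Linear(8, 1), nn.Sigmoid())

def mapper(s):
    # map [0, 1] to an integer index in [0, 99];
    # round() has zero gradient almost everywhere and .long() detaches
    return torch.round(s * 99).long()

x = torch.randn(1, 8)    # placeholder input to net-2
idx = mapper(net2(x))    # integer index I would then feed into Net
```

The resulting `idx` is an integer tensor with no grad, which is exactly where I suspect the gradient flow breaks.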
My question is: how can I make the entire pipeline trainable/differentiable? The mapper (rounding to an integer index) seems to break the gradient flow, and I am also not sure whether the row selection itself is differentiable.

Thanks in advance for your help.