I’m trying to create a simple 10-layer neural network to classify MNIST. I am constrained, however, because I need the weight matrices to be Toeplitz matrices (each diagonal in the matrix must be constant). Is there a way to enforce this condition using the `Linear` class? If not, how might I go about making a new subclass of `nn.Module` that would allow me to model this?

I’m new to PyTorch, so apologies if this is a silly question!

If I understand your use case correctly, you would like to use a Toeplitz matrix as the `weight` parameter in all linear layers and keep this constraint during training? Would each layer use a square matrix via `in_features=out_features`, so that each layer would have `2n - 1` trainable values (one per diagonal), or could the weights have any shape?

I think one valid approach could be to store the per-diagonal values as an `nn.Parameter` in your custom linear layer, construct the Toeplitz weight matrix from them during the `forward` pass, and apply the operation using the functional API via `F.linear(input, weight)`. Since the matrix is rebuilt from the shared parameters on every forward pass, autograd accumulates the gradients per diagonal and the Toeplitz structure is preserved throughout training.
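A minimal sketch of that idea, assuming square layers (`in_features == out_features == n`); the class name `ToeplitzLinear` and the initialization scale are my own choices, not an established API:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToeplitzLinear(nn.Module):
    """Linear layer whose n x n weight matrix is constrained to be Toeplitz.

    Only 2n - 1 values (one per diagonal) are trainable; the full weight
    matrix is assembled from them on every forward pass, so the constraint
    holds automatically during training.
    """

    def __init__(self, n: int):
        super().__init__()
        self.n = n
        # One trainable value per diagonal: n - 1 below the main diagonal,
        # the main diagonal itself, and n - 1 above it.
        self.coeffs = nn.Parameter(torch.randn(2 * n - 1) * 0.01)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n = self.n
        i = torch.arange(n).unsqueeze(1)  # row indices, shape (n, 1)
        j = torch.arange(n).unsqueeze(0)  # column indices, shape (1, n)
        # weight[i, j] = coeffs[i - j + (n - 1)] is constant along each
        # diagonal (i - j constant), i.e. a Toeplitz matrix.
        weight = self.coeffs[i - j + (n - 1)]
        return F.linear(x, weight)
```

Because the indexing in `forward` is differentiable, gradients from every entry on a given diagonal flow back into the single shared coefficient for that diagonal, which is exactly the constraint you want to keep during training.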