Modifying data in batch

Hi all,

I have a use case where I need to optimize over dual variables whose size matches my dataset. Namely, I would like to load (input, duals, targets) for each batch and then optimize over both the duals and my model’s parameters. My loss function takes (model(input), duals, targets) as input. At the end of the optimization, I would like to write the updated duals back to my dataset.

How can I do this?

Thanks!


I don’t clearly understand your use case, especially the dual variables being the size of the dataset.
If the dual variable you mention is learnable, you could set requires_grad=True on it and add it to the optimizer.
Does this method fit your use case?
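
Something like this minimal sketch, assuming a single dual tensor for the whole dataset; the sizes (n_data, m, input_dim) and the linear model are just placeholders I made up:

```python
import torch

# Minimal sketch: keep the duals as one learnable tensor and hand it to the
# optimizer alongside the model parameters.
# n_data, m, input_dim and the linear model are placeholders.
n_data, m, input_dim = 1000, 5, 10
model = torch.nn.Linear(input_dim, m)

duals = torch.zeros(n_data, m, requires_grad=True)

optimizer = torch.optim.SGD(
    [
        {"params": model.parameters(), "lr": 1e-2},
        {"params": [duals], "lr": 1e-1},
    ]
)
```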

I’m optimizing the dual of a convex optimization problem, which yields an unconstrained optimization problem of the form above. My dual variable has size n_data x m, since I originally had n_data x m constraints in my primal constrained convex problem, and each row of the dual variable corresponds to a specific datapoint. I would prefer to batch these variables with the datapoints to allow for easy shuffling.
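
One way to keep the dual rows aligned with the datapoints under shuffling is to store the full dual matrix as a single learnable tensor and have the Dataset return each sample’s index, then slice out the matching rows inside the training loop. A rough sketch under that assumption; DualDataset, dual_loss, and all sizes are illustrative placeholders, not from the thread:

```python
import torch
from torch.utils.data import Dataset, DataLoader

class DualDataset(Dataset):
    def __init__(self, inputs, targets):
        self.inputs, self.targets = inputs, targets

    def __len__(self):
        return len(self.inputs)

    def __getitem__(self, idx):
        # Returning idx lets the training loop slice the dual rows for this sample.
        return self.inputs[idx], self.targets[idx], idx


n_data, m, input_dim = 1000, 5, 10
inputs, targets = torch.randn(n_data, input_dim), torch.randn(n_data, m)

model = torch.nn.Linear(input_dim, m)
duals = torch.zeros(n_data, m, requires_grad=True)

optimizer = torch.optim.SGD(
    [{"params": model.parameters()}, {"params": [duals]}], lr=1e-2
)
loader = DataLoader(DualDataset(inputs, targets), batch_size=32, shuffle=True)

def dual_loss(preds, batch_duals, batch_targets):
    # Placeholder for the actual Lagrangian-style objective.
    return ((preds - batch_targets) * batch_duals).sum()

for x, y, idx in loader:
    optimizer.zero_grad()
    # Indexing the leaf tensor keeps gradients flowing to exactly these rows.
    loss = dual_loss(model(x), duals[idx], y)
    loss.backward()
    optimizer.step()

# The optimizer updates `duals` in place, so after training it already holds
# the values to write back next to the dataset.
updated_duals = duals.detach().clone()
```

With plain SGD (no momentum or weight decay), only the rows that appear in a batch receive nonzero gradients, so each step effectively touches just those dual rows.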