How to retain updated parameters on the actual model when using Higher?

Hi,

I’m using the higher library to develop a meta-learning algorithm.
Briefly, my code looks like the following:

with higher.innerloop_ctx(model, opt) as (fmodel, diffopt):
    for xs, ys in data:
        logits = fmodel(xs)  # modified `params` can also be passed as a kwarg
        loss = loss_function(logits, ys)  # no need to call loss.backward()
        diffopt.step(loss)
        # update my hyperparameters

# update the actual model using the hyperparameters learned in the higher context.

My problem is that I’d like diffopt to update my actual model as well as fmodel. To do that, I’m currently running another update step after learning my hyperparameters in the higher context, which costs an extra forward/backward pass.
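Concretely, that workaround looks roughly like this (a minimal sketch; apply_hyperparams is a hypothetical stand-in for however the learned hyperparameters get applied, not a real function):

apply_hyperparams(opt, hyperparams)  # hypothetical helper, not part of higher
for xs, ys in data:
    opt.zero_grad()
    loss = loss_function(model(xs), ys)  # extra forward pass on the real model
    loss.backward()                      # extra backward pass
    opt.step()                           # updates `model` in place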

Is such a thing possible?
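In other words, is there a way to get the effect of something like this without the extra pass? (A rough sketch; I’m assuming fmodel.parameters() still returns the latest fast weights in the same order as model.parameters(), and I don’t know whether this is the intended approach with higher.)

# copy the fast weights learned inside the context back into the real model
with torch.no_grad():
    for p, fast_p in zip(model.parameters(), fmodel.parameters()):
        p.copy_(fast_p.detach())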