Computing every iteration, every 20 epochs


What if I wanted to compute something on every batch output (so every iteration), but I only wanted to do this every 20 epochs? How could I do this? Is there a sort of “detach_event_handler” as a counterpart to attach_event_handler? Or is there some other way to do this? Thanks!

I might have misunderstood your question but could you not just use:

if epoch % 20 == 0:

I think I didn’t explain my question well.

Consider the following:

trainer.add_event_handler(Events.ITERATION_COMPLETED(every=1), foo)

I want to run foo() on every iteration, but only every 20 epochs.

I can’t just do:

if epoch % 20 == 0:
    trainer.add_event_handler(Events.ITERATION_COMPLETED(every=1), foo)

Because the first time epoch reaches 20, the handler gets attached to the trainer, and from then on foo() will be called on every single iteration, whether epoch % 20 == 0 holds or not.

@pytorchnewbie I think the point was about using if epoch % 20 == 0 inside foo, which is one possible implementation.
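A minimal sketch of that approach: attach foo to every iteration and put the epoch check inside the handler itself. The stand-in engine below (built with SimpleNamespace) is only there to make the snippet self-contained; with Ignite, the real Engine passed to the handler exposes engine.state.epoch the same way.

```python
from types import SimpleNamespace

# Hypothetical stand-in for an Ignite engine with a .state.epoch attribute;
# in real code, Ignite's Engine provides this.
engine = SimpleNamespace(state=SimpleNamespace(epoch=0))

calls = []

def foo(engine):
    # The guard lives inside the handler: runs on every iteration,
    # but only does work on every 20th epoch.
    if engine.state.epoch % 20 != 0:
        return
    calls.append(engine.state.epoch)

# Simulate 40 epochs x 3 iterations; the handler fires every iteration.
for epoch in range(1, 41):
    engine.state.epoch = epoch
    for iteration in range(3):
        foo(engine)

print(sorted(set(calls)))  # → [20, 40]
```

The handler runs cheaply on most iterations and only does its real work on epochs 20 and 40.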

If you would like to set up customized event filtering beyond every, please see the usage of the event_filter argument:
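For illustration, an event_filter is a callable taking the engine and the event count and returning a bool; the handler fires only for events where it returns True. The filter function name below is made up, and the attachment line is shown as a comment; the loop merely simulates which events such a filter would pass.

```python
from types import SimpleNamespace

def every_iter_of_20th_epoch(engine, event):
    # Ignite calls event_filter with (engine, event) and fires the
    # handler only when it returns True.
    return engine.state.epoch % 20 == 0

# With Ignite this would be attached as (not run here):
# trainer.add_event_handler(
#     Events.ITERATION_COMPLETED(event_filter=every_iter_of_20th_epoch), foo)

# Stand-in engine to simulate 40 epochs x 5 iterations:
engine = SimpleNamespace(state=SimpleNamespace(epoch=0))
fired = []
for epoch in range(1, 41):
    engine.state.epoch = epoch
    for iteration in range(1, 6):
        if every_iter_of_20th_epoch(engine, iteration):
            fired.append((epoch, iteration))

print(len(fired))  # epochs 20 and 40, 5 iterations each → 10 events
```

This keeps foo itself free of any epoch logic: the filtering is declared once, at attach time.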

Either that.
Or, as far as I understand it, you want to call foo() on every iteration of every 20th epoch, right?
So this:

for epoch in range(n_epochs):    # your epoch loop
    for iteration, data in enumerate(your_dataloader):    # your iteration loop
        if (epoch + 1) % 20 == 0:
            foo()  # gets called on every iteration of every 20th epoch

should work.
@pytorchnewbie Is this not the behavior you asked for?

@RaLo4 The question is tagged with the “ignite” category and is about how to do this with PyTorch-Ignite.