Building standalone evaluation script

Hi. I am trying to build a standalone evaluation script using PyTorch Ignite, but I am having trouble getting the desired output.
From the following code:

evaluator = create_evaluator(cfg, model, tokenizer, selected_metrics, logger)

@evaluator.on(Events.ITERATION_COMPLETED(every=1) | Events.COMPLETED)
def log_info():
    metrics_output = "\n".join(
        [f"\t{k}: {v}" for k, v in evaluator.state.metrics.items()]
    )
    logger.info(
        f"\nEvaluation time (seconds): {evaluator.state.times['COMPLETED']:.2f}\n"
        f"Validation metrics:\n {metrics_output}"
    )

try:
    state = evaluator.run(val_dataloader)
except Exception as e:
    logger.exception("")
    raise e

I get the following truncated output:

2022-01-06 16:32:30,537 Test INFO:
Evaluation time (seconds): 0.00
Validation metrics:

…

2022-01-06 16:32:32,628 Test INFO:
Evaluation time (seconds): 0.00
Validation metrics:

2022-01-06 16:32:32,628 Test INFO:
Evaluation time (seconds): 30.94
Validation metrics:
meteor: 0.2809806329259838
ter: 491.0379854211612

As you can see, the output is only printed correctly once the engine’s run is complete. Until then, everything is zero. How can I properly get the evaluator’s state? Thank you.

Hey @AfonsoSalgadoSousa, according to this - State — PyTorch-Ignite v0.4.7 Documentation
state.times only stores the time after an epoch is completed (key: EPOCH_COMPLETED) or after the evaluator has finished validation (key: COMPLETED), which is the one you are using. There is no key in state for the time taken by each iteration.
To get the desired output, you can use BasicTimeProfiler - How to do time profiling | PyTorch-Ignite to measure how long each ITERATION_COMPLETED takes.


Thank you for the answer. While I now understand that using the engine’s state is not the way to go, I am still not sure how I can achieve the behaviour I want using profilers…

@AfonsoSalgadoSousa there are two questions here: 1) why time is not printed and 2) how to print intermediate metric value.

Q1: during evaluation we typically run a single epoch, and evaluator.state.times does not contain per-iteration times, only the epoch and total run times. It would be better to use ignite.handlers.Timer for iterations.

Q2: to print an intermediate metric value we have to compute it per iteration, i.e. call metric.compute or metric.completed; see the example below.

Here is an example of how to achieve what you would like :

import time

import torch
from ignite.engine import Engine, Events
from ignite.handlers import Timer
from ignite.metrics import Accuracy


def eval_step(engine, batch):
    return batch


evaluator = Engine(eval_step)

acc = Accuracy()
acc.attach(evaluator, "accuracy")

# Timer starts counting as soon as it is constructed
timer = Timer()


# Compute intermediate metrics
@evaluator.on(Events.ITERATION_COMPLETED)
def compute_and_measure():
    # stores the running metric value into evaluator.state.metrics
    acc.completed(evaluator, "accuracy")
    # added sleep to emulate an amount of time taken by processing a single iteration
    time.sleep(0.1)


@evaluator.on(Events.ITERATION_COMPLETED(every=1) | Events.COMPLETED)
def log_info():
    metrics_output = "\n".join(
        [f"\t{k}: {v}" for k, v in evaluator.state.metrics.items()]
    )
    print(
        f"\nEvaluation time (seconds): {timer.value():.8f}\nValidation metrics:\n {metrics_output}"
    )


bs = 32
n_classes = 10

data = [
    [torch.rand(bs, n_classes), torch.randint(0, n_classes, size=[bs, ])]
    for _ in range(8)
]

state = evaluator.run(data)
print(state.times)

Output:


Evaluation time (seconds): 0.10191662
Validation metrics:
 	accuracy: 0.03125

Evaluation time (seconds): 0.20312049
Validation metrics:
 	accuracy: 0.0625

Evaluation time (seconds): 0.30411697
Validation metrics:
 	accuracy: 0.125

Evaluation time (seconds): 0.40528948
Validation metrics:
 	accuracy: 0.125

Evaluation time (seconds): 0.50620632
Validation metrics:
 	accuracy: 0.125

Evaluation time (seconds): 0.60733969
Validation metrics:
 	accuracy: 0.125

Evaluation time (seconds): 0.70837395
Validation metrics:
 	accuracy: 0.12053571428571429

Evaluation time (seconds): 0.80952449
Validation metrics:
 	accuracy: 0.109375

Evaluation time (seconds): 0.81010838
Validation metrics:
 	accuracy: 0.109375

{'EPOCH_COMPLETED': 0.8085811138153076, 'COMPLETED': 0.8087835311889648}

HTH


It surely was helpful. That is it. Thank you very much.
