Time measurement of each layer?

Hi, can anybody tell me how to use the TensorBoard module in PyTorch to measure the inference time of each layer?
I appreciate any reply. Thank you so much~

To measure the inference time of each layer you can use forward hooks.

First, I am going to define my model:

import torch

class TheModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # A small stack of linear layers
        self.layer1 =  torch.nn.Linear(5,4)
        self.layer2 =  torch.nn.Linear(4,3)
        self.layer3 =  torch.nn.Linear(3,2)
        self.layer4 =  torch.nn.Linear(2,1)
        self.layerM = torch.nn.Linear(4,2)  # skip connection: layer1 output -> layer4 input

    def forward(self, x):
        x1 = self.layer1(x)
        x2 = self.layer2(x1)
        x3 = self.layer3(x2) + self.layerM(x1)  # add the skip connection
        x4 = self.layer4(x3)        
        return x4

Then define your hook functions:

import time
## Define hook functions
take_time_dict = {}

def take_time_pre(layer_name, module, input):
    take_time_dict[layer_name] = time.time()

def take_time(layer_name, module, input, output):
    take_time_dict[layer_name] = time.time() - take_time_dict[layer_name]
    ## for TensorBoard you would log the value with a writer here

Here I'm creating a dictionary to store the times recorded by the pre-hooks and hooks.
Normally a hook function takes `module`, `input`, and `output` (a pre-hook takes no `output`). In this case I add an extra `layer_name` argument, which we need so we know which layer each measurement belongs to.
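One caveat with the hooks above: `time.time()` is a wall-clock with limited resolution, so for sub-millisecond layers `time.perf_counter()` (a monotonic, high-resolution stdlib clock) is usually a better choice; on GPU you would also need `torch.cuda.synchronize()` before reading the clock, since CUDA kernels run asynchronously. A minimal sketch of the `perf_counter` variant, where the manual calls below just simulate what PyTorch does around one layer's forward pass:

```python
import time

take_time_dict = {}

def take_time_pre(layer_name, module, inp):
    # record the start time with a monotonic, high-resolution clock
    take_time_dict[layer_name] = time.perf_counter()

def take_time(layer_name, module, inp, output):
    # replace the start time with the elapsed time for this layer
    take_time_dict[layer_name] = time.perf_counter() - take_time_dict[layer_name]

# Simulate what PyTorch does around one layer's forward pass:
take_time_pre("layer1", None, None)
time.sleep(0.01)                       # stands in for the layer computation
take_time("layer1", None, None, None)
print(take_time_dict["layer1"] > 0)    # True
```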

In the loop below, we use `functools.partial` to bind the layer argument for each layer:

from functools import partial

# Create Model
model = TheModel()

# Register pre- and post-forward hooks for every layer
for layer in model.children():
    layer.register_forward_pre_hook( partial(take_time_pre, layer) )
    layer.register_forward_hook( partial(take_time, layer) )

x = torch.rand(1,5)
model(x)  
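If it's not obvious why `partial(take_time_pre, layer)` works: `partial` pre-binds the first positional argument, so when PyTorch invokes the hook as `hook(module, input)`, the call becomes `take_time_pre(layer, module, input)`. A stand-alone sketch with toy names (no torch needed):

```python
from functools import partial

calls = []

def take_time_pre(layer_name, module, inp):
    calls.append((layer_name, module, inp))

# bind the first argument, as in register_forward_pre_hook(partial(...))
hook = partial(take_time_pre, "layer1")

# PyTorch would invoke the hook as hook(module, input):
hook("fake_module", "fake_input")
print(calls)  # [('layer1', 'fake_module', 'fake_input')]
```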

And here is the result of one inference:

take_time_dict
# {Linear(in_features=2, out_features=1, bias=True): 4.100799560546875e-05,
# Linear(in_features=3, out_features=2, bias=True): 3.600120544433594e-05,
# Linear(in_features=4, out_features=2, bias=True): 4.744529724121094e-05,
# Linear(in_features=4, out_features=3, bias=True): 5.53131103515625e-05,
# Linear(in_features=5, out_features=4, bias=True): 0.00024390220642089844}
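Since we passed the module objects themselves as the bound argument, the dictionary keys are the modules, which prints rather noisily. The values are in seconds, so for readability you might sort the entries and convert to milliseconds. A small sketch with made-up timings keyed by name instead of by module:

```python
# hypothetical timings in seconds, keyed by layer name for readability
timings = {
    "layer1": 0.00024390220642089844,
    "layer2": 5.53131103515625e-05,
}

# slowest layer first, shown in milliseconds
for name, t in sorted(timings.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {t * 1e3:.3f} ms")
# layer1: 0.244 ms
# layer2: 0.055 ms
```

(To key the real dictionary by name, you could iterate with `model.named_children()` instead of `model.children()` and bind the name rather than the module.)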

This is a small example; from here you can log the values with the TensorBoard writer and add more complexity to the hook functions.

Colab Link


Thanks a lot!!!
Sorry for the late reply.