Gradient flow TensorBoard error

Good morning,
I would like to visualize the flow of the network parameters' gradients during training using TensorBoard.

My code is:

import logging

def training(epoch, net, device, criterion, train_data, optimizer, tb):
    """
    Run one training epoch
    :param epoch: Current epoch
    :param net: Network
    :param device: Torch device
    :param criterion: function to evaluate the loss
    :param train_data: Training Dataset
    :param optimizer: Optimizer
    :param tb: TensorBoard SummaryWriter
    :return: Average loss for the epoch
    """
    
    results = 0
    batch_idx = 0
    # change flag of training
    net.train()
    
    for sample in train_data:

        batch_idx += 1

        # load of the normalized image
        img = sample['image'].to(device)
        
        # load of the ground bounding boxes
        ground_bb = sample['bb'].to(device)
        
        # forward propagation
        pred_bb = net(img)
        
        # evaluate the loss
        loss = criterion(ground_bb, pred_bb)
        
        results += loss.item()
        
        #  gradients are zeroed
        optimizer.zero_grad()
        
        # backward propagation
        loss.backward()
        
        # optimization of the parameters
        optimizer.step()        
        
        for param in net.parameters():
            tb.add_histogram('gradient',param.grad,param)

        logging.info('Epoch: {}, Batch: {}, Loss: {:0.4f}'.format(epoch, batch_idx, loss.item()))
    
    return results/batch_idx

However, it returns an error. Do you have any ideas on how to display the gradients?

What kind of error are you getting?
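
In case it helps while you check: my guess is the add_histogram call. The third positional argument of SummaryWriter.add_histogram is global_step, which is expected to be an integer (or left out), so passing the parameter tensor there would fail. Reusing the tag 'gradient' for every parameter also lumps all gradients into a single histogram; net.named_parameters() gives each parameter its own tag. A minimal sketch of how the logging could look (assuming tb is a torch.utils.tensorboard.SummaryWriter; the log_gradients name and the grad/ tag prefix are just illustrative):

import torch
from torch.utils.tensorboard import SummaryWriter

def log_gradients(tb, net, global_step):
    """Write one histogram per parameter gradient; call after loss.backward()."""
    for name, param in net.named_parameters():
        # skip parameters that did not receive a gradient in this step
        if param.grad is not None:
            tb.add_histogram('grad/' + name, param.grad, global_step)

In your training loop you could call it right after loss.backward() (the gradients are still available after optimizer.step(), so calling it after the step also works, as long as it happens before the next zero_grad()), with a step counter such as epoch * len(train_data) + batch_idx if train_data has a length.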