TensorBoard and RNNs

Hi, I've been trying to get TensorBoard working for some recurrent models. I want to visualise the graph for a model, but when I try to use .add_graph, the call fails with:

'RuntimeError: Cannot insert a Tensor that requires grad as a constant. Consider making it a parameter or input, or detaching the gradient'

As far as I can tell, it's complaining about the hidden state of the recurrent unit. I can make the error go away if I detach the hidden state during forward(), but then the network is unable to train (a variant of that workaround is sketched after the code). Besides, the network runs and updates its parameters properly, so I don't see why TensorBoard should be unable to build a graph from it. Thanks! Code is below.

import torch
from torch.utils.tensorboard import SummaryWriter
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

        # input_size=1, hidden_size=10, num_layers=1
        self.rec1 = nn.RNN(1, 10, 1, bias=False)
        self.lin1 = nn.Linear(10, 1)
        # The hidden state is carried across forward() calls as an attribute.
        self.hidden = None

    def forward(self, x):

        x, self.hidden = self.rec1(x, self.hidden)
        # Detaching here makes add_graph work, but then gradients stop
        # flowing through the hidden state and the network can't train:
        # self.hidden = self.hidden.detach()
        return self.lin1(x)


if __name__ == '__main__':

    # Random dummy data, shaped (seq_len, batch, features).
    in_data = torch.randn([100, 10, 1])
    tgt_data = torch.randn([100, 10, 1])

    network = Net()
    loss_fcn = nn.MSELoss()
    optimizer = torch.optim.Adam(network.parameters(), 0.0005)

    # Initial hidden state, shaped (num_layers, batch, hidden_size).
    network.hidden = torch.zeros([1, 10, 10])
    out = network(in_data)

    # One training step runs and updates the parameters without complaint.
    loss = loss_fcn(out, tgt_data)
    loss.backward()
    optimizer.step()

    writer = SummaryWriter('runs/test_icicles')
    writer.add_graph(network, in_data)  # this call raises the RuntimeError above
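
For what it's worth, the sketch below shows the variant I'd expect to be safe, assuming the stored hidden state really is what the tracer is choking on: detach network.hidden once, just before the add_graph call, instead of inside forward(). Since that happens after backward() and step(), it shouldn't interfere with training, but it feels like a hack and I'd still like to understand why it's needed.

# Hypothetical workaround (replacing the last two lines of the script above):
# after detaching, network.hidden no longer requires grad, so the tracer
# should be able to insert it as a constant.
network.hidden = network.hidden.detach()

writer = SummaryWriter('runs/test_icicles')
writer.add_graph(network, in_data)
writer.close()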