pack_padded_sequence and TensorBoard

I'm having an issue with the TensorBoard SummaryWriter and torch.nn.utils.rnn.pack_padded_sequence. The pack_padded_sequence docs say that 'lengths' can be either a tensor (which MUST be on the CPU) or a Python list.
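In other words, as far as I understand the docs, either of these forms should be acceptable for the lengths argument (the values here are just made up for illustration):

# either a 1D int64 tensor that lives on the CPU...
lengths_tensor = torch.tensor([10, 7, 5, 3])
# ...or a plain Python list of ints
lengths_list = [10, 7, 5, 3]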

Since I'm training on a GPU, I call pack_padded_sequence in my model's forward like so:

packed_input = torch.nn.utils.rnn.pack_padded_sequence(embedding_batched_sentences, seq_lengths_clamped.cpu().numpy(), batch_first=True, enforce_sorted=False)
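For context, here is a stripped-down sketch of the forward pass. The real model is larger; the class name, dimensions, and LSTM layer here are placeholders, but the pack/unpack logic is the same:

import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

class SentenceEncoder(nn.Module):
    # placeholder architecture, roughly what my model does
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

    def forward(self, padded_sentences, seq_lengths_clamped):
        embedding_batched_sentences = self.embedding(padded_sentences)
        # lengths are moved off the GPU because pack_padded_sequence wants them on the CPU
        packed_input = pack_padded_sequence(
            embedding_batched_sentences,
            seq_lengths_clamped.cpu().numpy(),
            batch_first=True,
            enforce_sorted=False,
        )
        packed_output, _ = self.lstm(packed_input)
        output, _ = pad_packed_sequence(packed_output, batch_first=True)
        return output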

When I run my model without any use of SummaryWriter, it works just fine, exactly as expected. No errors at all.

However, when I try to add_graph my model to the SummaryWriter, it traces my forward method and I get the following error:

→ 249 _VF._pack_padded_sequence(input, lengths, batch_first)
250 return _packed_sequence_init(data, batch_sizes, sorted_indices, None)
251

RuntimeError: 'lengths' argument should be a 1D CPU int64 tensor, but got 0D cpu Long tensor

Further up in the same traceback, the error is triggered inside this call:

→ 286 trace = torch.jit.trace(model, args)
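For reference, this is roughly how I call add_graph (placeholder names again; my real inputs are a padded batch of token IDs on the GPU plus the corresponding lengths):

from torch.utils.tensorboard import SummaryWriter

device = torch.device("cuda")
model = SentenceEncoder().to(device)
padded_sentences = torch.randint(1, 1000, (4, 12), device=device)
seq_lengths_clamped = torch.tensor([12, 9, 6, 4], device=device)

# a plain forward pass like this runs without any problem
model(padded_sentences, seq_lengths_clamped)

writer = SummaryWriter()
# this is the call that ends up in torch.jit.trace and raises the RuntimeError above
writer.add_graph(model, (padded_sentences, seq_lengths_clamped))
writer.close()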

Anyone know what's going on here?