Training an LSTM on variable-length sequences and making sure backpropagation is correct

I am using packed sequences for variable-length inputs: a custom collate function for the DataLoader, and pack_padded_sequence to feed the batch to the RNN. Do I need to call pad_packed_sequence on the output before computing the loss so that backpropagation works, or can I skip that step? How can I make sure the backpropagation is correct?

train_dataloader = torch.utils.data.DataLoader(train_event_loader, batch_size=2, shuffle=False, collate_fn=my_collate, num_workers=0)

from torch.nn.utils.rnn import pack_sequence

def my_collate(batch):
    # batch is a list of (sequence, target) tuples
    data = [item[0] for item in batch]
    # pack the sequences themselves, not the (sequence, target) tuples;
    # enforce_sorted=False avoids having to pre-sort the batch by length
    data = pack_sequence(data, enforce_sorted=False)
    targets = [item[1] for item in batch]
    return [data, targets]
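Since pack_padded_sequence below needs a padded tensor plus the original lengths (seq_lens), a padding-based collate that produces them would look roughly like this (a minimal sketch; pad_sequence, the length computation, and the name my_padding_collate are my assumptions, not from my actual code):

import torch
from torch.nn.utils.rnn import pad_sequence

def my_padding_collate(batch):
    # pad instead of pack, and keep the original lengths for pack_padded_sequence
    seqs = [item[0] for item in batch]
    seq_lens = [len(s) for s in seqs]                  # lengths before padding
    data = pad_sequence(seqs, batch_first=True)        # (batch, max_len, features)
    targets = [item[1] for item in batch]
    return data, torch.tensor(seq_lens), targets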

packed_seq_batch = torch.nn.utils.rnn.pack_padded_sequence(
    data, lengths=seq_lens, batch_first=True)

output, (hn, cn) = lstm(packed_seq_batch.float())
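The LSTM itself is defined roughly like this (the sizes are placeholders, not my real hyperparameters):

import torch

# input_size must match the feature dimension of each time step
lstm = torch.nn.LSTM(input_size=3, hidden_size=8, num_layers=1, batch_first=True)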

### Do I need this step for backprop, or can I skip it?
padded_output, output_lens = torch.nn.utils.rnn.pad_packed_sequence(output, batch_first=True, total_length=5)

# ... compute the loss here and call loss.backward()
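This is roughly how I was planning to compute the loss on the padded output, masking out the padded time steps so they don't contribute to the gradient (a sketch; MSELoss and the padded_targets tensor are assumptions on my side, my real loss and targets may differ):

import torch

criterion = torch.nn.MSELoss(reduction='none')

# padded_targets: (batch, max_len, hidden) tensor aligned with padded_output
elementwise_loss = criterion(padded_output, padded_targets)

# build a mask from the true lengths returned by pad_packed_sequence,
# so positions beyond each sequence's length are ignored
max_len = padded_output.size(1)
mask = (torch.arange(max_len)[None, :] < output_lens[:, None]).unsqueeze(-1).float()

loss = (elementwise_loss * mask).sum() / mask.sum()
loss.backward()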