nn.LSTM can't take torch.LongTensor as input

Hi, I was trying to pass a nested Python list as nn.LSTM’s input. I converted the list to a LongTensor. My code is below. May I ask whether nn.LSTM can only take a FloatTensor as input, never a LongTensor? Thank you.

import torch
import torch.nn as nn
from torch.autograd import Variable

size_vocab = 5
hidden_size = 5
seq_len = 10

lstm = nn.LSTM(size_vocab, hidden_size)

inputs = Variable(torch.randn(seq_len, 1, size_vocab))
out, hidden = lstm(inputs, None) # this works

one_hot = [[0] * size_vocab] * seq_len
inputs2 = Variable(torch.LongTensor([one_hot]).transpose(0, 1))
out2, hidden2 = lstm(inputs2, None) # this won't work

Yes, nn.LSTM (and all of nn.*) cannot take a LongTensor as input. The only exceptions are nn.Embedding, which accepts an IntTensor or LongTensor as input (because it performs index lookups), and some classification loss functions, which take LongTensors as targets.
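To make the original idea work, the LongTensor of indices has to become a float tensor before it reaches the LSTM. A minimal sketch of the two routes mentioned above — building the one-hot vectors as a FloatTensor directly, or letting nn.Embedding do the index lookup (the index values here are made up for illustration):

```python
import torch
import torch.nn as nn

size_vocab = 5
hidden_size = 5
seq_len = 10

lstm = nn.LSTM(size_vocab, hidden_size)

# Hypothetical token indices, one per timestep.
indices = torch.LongTensor([0, 1, 2, 3, 4, 0, 1, 2, 3, 4])

# Option 1: build the one-hot encoding as a FloatTensor from the start.
one_hot = torch.zeros(seq_len, 1, size_vocab)       # float, not long
one_hot[torch.arange(seq_len), 0, indices] = 1.0
out, hidden = lstm(one_hot)                          # works: input is float

# Option 2: feed the LongTensor to nn.Embedding, which returns floats.
embed = nn.Embedding(size_vocab, size_vocab)
dense = embed(indices).unsqueeze(1)                  # shape (seq_len, 1, size_vocab)
out2, hidden2 = lstm(dense)
```

Option 2 is usually preferable in practice: the embedding vectors are learned rather than fixed one-hot codes, and the LongTensor never needs converting by hand.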