Using an Embedding layer instead of a multi-hot vector

I have data where, at each time step, there can be multiple items. I am training an RNN on this data. My dictionary size is around 600, but I will give an example with a dictionary of size 3: a, b, and c.
T1 = a , b , {a,b}, c
T2 = c, a
T3 = {a,b,c}, c
Normally, I use a multi-hot vector (a sparse vector) as input:
T1 = [[1,0,0],[0,1,0],[1,1,0],[0,0,1]]
T2 = [[0,0,1],[1,0,0]]
T3 = [[1,1,1],[0,0,1]]
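
For concreteness, this is roughly how I build those multi-hot tensors (the names vocab and encode_step are just for illustration; the real dictionary has ~600 entries):

    import torch

    # toy dictionary; the real one has ~600 entries
    vocab = {'a': 0, 'b': 1, 'c': 2}

    def encode_step(items):
        # one multi-hot vector for the set of items at a single time step
        vec = torch.zeros(len(vocab))
        for item in items:
            vec[vocab[item]] = 1.0
        return vec

    # T1 = a, b, {a,b}, c  ->  (seq_len, vocab_size) tensor
    T1 = torch.stack([encode_step(s) for s in [{'a'}, {'b'}, {'a', 'b'}, {'c'}]])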

Instead of this multi-hot vector, I want to use an Embedding layer.
My current nn class is:

import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence

class Net(nn.Module):

    def __init__(self, batch_size, input_size, hidden_size):
        super().__init__()
        self.batch_size = batch_size
        self.hidden_size = hidden_size
        self.rnn = nn.RNN(input_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, 1)
        self.out_act = nn.Sigmoid()

    def forward(self, input_, sizes):
        # input_: padded batch of multi-hot vectors, (batch, max_len, input_size)
        # sizes: true sequence lengths, sorted in descending order
        hidden = self.init_hidden()
        packed = pack_padded_sequence(input_, sizes, batch_first=True)
        _, h_state = self.rnn(packed, hidden)
        out = self.out(h_state)
        y = self.out_act(out)
        return y

    def init_hidden(self):
        # initial hidden state must be (num_layers, batch, hidden_size)
        return torch.zeros(1, self.batch_size, self.hidden_size).cuda()
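
For reference, this is roughly how I call the network (a padded batch sorted by decreasing length; the shapes match the toy dictionary of size 3 and are only illustrative):

    net = Net(batch_size=2, input_size=3, hidden_size=16).cuda()  # assumes a GPU, as above

    # padded batch, shape (batch, max_len, input_size): T1 and T2 from the example
    batch = torch.zeros(2, 4, 3).cuda()
    batch[0] = torch.tensor([[1,0,0],[0,1,0],[1,1,0],[0,0,1]], dtype=torch.float)  # T1
    batch[1, :2] = torch.tensor([[0,0,1],[1,0,0]], dtype=torch.float)              # T2

    sizes = [4, 2]         # true lengths, descending
    y = net(batch, sizes)  # shape (1, batch, 1)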

My question is: how can I change my Net so that it uses Embeddings instead of multi-hot vectors?
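
One idea I have considered (I am not sure it is the right approach) is to sum the embeddings of the items present at each time step, which is the same as multiplying the multi-hot vector by the embedding weight matrix. A minimal sketch, with an illustrative embedding size of 64:

    import torch
    import torch.nn as nn

    emb_dim = 64                            # illustrative embedding size
    embedding = nn.Embedding(600, emb_dim)  # ~600-item dictionary

    # padded batch of multi-hot vectors, shape (batch, max_len, vocab_size)
    multi_hot = torch.zeros(2, 4, 600)
    multi_hot[0, 0, 0] = 1.0  # e.g. step 0 of sequence 0 contains item 0

    # summing the embeddings of the active items is the same as multiplying
    # the multi-hot vector by the embedding weight matrix
    emb_input = multi_hot @ embedding.weight  # (batch, max_len, emb_dim)

The RNN would then be built as nn.RNN(emb_dim, hidden_size, batch_first=True), and emb_input would replace the multi-hot input_. Is summing embeddings like this reasonable, or is there a more idiomatic way (e.g., nn.EmbeddingBag with mode='sum')?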