Let’s say I have this code in TF (v1). I want to rewrite it in PyTorch:
self.W = tf.v1.get_variable("W", dtype=tf.v1.float32,
    initializer=tf.v1.constant(self.params['wordvectors'].astype('float32')))
self.input_support_set_sents = tf.v1.placeholder(tf.v1.int32,
    [None, None, sen_len], 'support_set_sents')
self.support_set_sents = tf.v1.nn.embedding_lookup(self.W, self.input_support_set_sents)
So far I have this:
self.W = torch.tensor(config['wordvectors'], dtype=torch.float32)
# input_support_set_sents is passed in as a function argument instead of a placeholder
self.support_set_sents = None  # TODO: the embedding lookup goes here
embedding_lookup() in TF basically takes the word indices from its second argument and returns the corresponding embedding values from its first argument (the embedding matrix).
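For what it's worth, here is a minimal PyTorch sketch of the same three steps, assuming config['wordvectors'] is a NumPy float array and that the indices arrive as a LongTensor argument (the class name MatchingEncoder is made up for illustration):

import torch
import torch.nn as nn
import torch.nn.functional as F

class MatchingEncoder(nn.Module):  # hypothetical name, for illustration only
    def __init__(self, config):
        super().__init__()
        # tf.get_variable with a constant initializer becomes a trainable
        # nn.Parameter initialized from the pretrained word vectors
        self.W = nn.Parameter(torch.tensor(config['wordvectors'], dtype=torch.float32))

    def forward(self, input_support_set_sents):
        # There is no placeholder in PyTorch: the int32 placeholder becomes a
        # LongTensor argument of shape [batch, set_size, sen_len].
        # tf.nn.embedding_lookup(W, ids) corresponds to F.embedding(ids, W);
        # note the argument order is reversed relative to TF.
        return F.embedding(input_support_set_sents, self.W)

Note that torch.tensor(...) on its own (as in the snippet above) creates a plain, non-trainable tensor, whereas tf.get_variable is trainable by default, hence the nn.Parameter wrapper. Equivalently, emb = nn.Embedding.from_pretrained(torch.tensor(config['wordvectors'], dtype=torch.float32), freeze=False) bundles the parameter and the lookup into one module, after which emb(input_support_set_sents) performs the lookup.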