Converting TensorFlow Code to PyTorch by Removing Placeholders

Hi. I know that placeholders in TensorFlow are used to define variables that will have data assigned to them later on, but cannot be directly evaluated.
I was trying to convert the following code to PyTorch: CVAE/model.py at master · jaechanglim/CVAE · GitHub
In the code they define placeholders like this:
self.X = tf.compat.v1.placeholder(tf.int32, [self.batch_size, None])
self.Y = tf.compat.v1.placeholder(tf.int32, [self.batch_size, None])
self.C = tf.compat.v1.placeholder(tf.float32, [self.batch_size, self.num_prop])
self.L = tf.compat.v1.placeholder(tf.int32, [self.batch_size])
These placeholders are used before data is assigned to them such as here:
weights = tf.sequence_mask(self.L, tf.shape(self.X)[1])
Then data is finally assigned to them such as here:
X = tf.nn.embedding_lookup(self.embedding_encode, self.X)
I have read that in PyTorch you pass tensors directly to modules and placeholders are not used, but what exactly does this mean? Also, what changes would I make to this code to convert it to PyTorch, so that I don't declare a placeholder for self.X but can still use it? I would appreciate any clarification and assistance on how I could do this. Thanks!

You are right that PyTorch does not use placeholder variables and allows you to directly use tensors.
I.e. instead of using this logic:

# initialize placeholders with shape etc.
var1 = placeholder(...)
var2 = placeholder(...)

# set values to placeholders
var1.load_data(...)
...

# execute model graph
...

you can use tensors directly, much like NumPy arrays, with the added benefit of Autograd and the ability to run your workload on different devices such as the GPU:

# initialize input with random values
x = torch.randn(shape)
# or load an image and convert it to a tensor
img = PIL.Image.open(...)
x = torch.from_numpy(np.array(img))
# or load from a numpy array
arr = np.load(...)
x = torch.from_numpy(arr)
...

# execute model
out = model(x)

# now you can directly print the output
print(out)
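To connect this to the code you posted: the placeholders self.X, self.C, and self.L simply become tensors you pass into the model, tf.nn.embedding_lookup maps to nn.Embedding, and tf.sequence_mask can be reproduced by comparing against torch.arange. Here is a rough sketch, where the sizes and dummy random inputs are just assumptions for illustration:

```python
import torch
import torch.nn as nn

batch_size, seq_len, vocab_size, num_prop, emb_dim = 4, 10, 32, 5, 16

# plain tensors replace the placeholders; no graph or session is needed
X = torch.randint(0, vocab_size, (batch_size, seq_len))  # was self.X
C = torch.randn(batch_size, num_prop)                    # was self.C
L = torch.randint(1, seq_len + 1, (batch_size,))         # was self.L

# replaces tf.nn.embedding_lookup(self.embedding_encode, self.X)
embedding = nn.Embedding(vocab_size, emb_dim)
X_emb = embedding(X)  # shape: (batch_size, seq_len, emb_dim)

# replaces tf.sequence_mask(self.L, tf.shape(self.X)[1]):
# position i is True if i < L for that sample
weights = torch.arange(seq_len).unsqueeze(0) < L.unsqueeze(1)  # (batch_size, seq_len)

print(X_emb.shape, weights.shape)
```

Note that the embedding and the mask are computed eagerly the moment these lines run, so there is no "define first, feed later" split as with placeholders.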

Thanks for the reply. I have one question: In Tensorflow, the placeholders sometimes use None, meaning that the size of that dimension could be any number. Would I be able to do anything like that in PyTorch or would it not be necessary?

You don’t need to define “dynamic” and “static” axes and can just create the tensors in your desired shape. Since shapes are resolved at runtime in PyTorch, the same module can accept inputs whose sizes differ from call to call, so there is no need for a None dimension.
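As a quick illustration (the layer sizes here are made up): the same embedding layer can be called with batches of different sequence lengths, with no None axis declared up front:

```python
import torch
import torch.nn as nn

embedding = nn.Embedding(num_embeddings=32, embedding_dim=8)

# the sequence dimension can vary between calls
short = torch.randint(0, 32, (4, 5))   # batch of 4, length 5
long = torch.randint(0, 32, (4, 50))   # batch of 4, length 50

print(embedding(short).shape)  # torch.Size([4, 5, 8])
print(embedding(long).shape)   # torch.Size([4, 50, 8])
```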