TorchRL - Error when creating an environment with batch_size

Hello there!

I have a question about TorchRL. I did the tutorial "Pendulum: Writing your environment and transforms with TorchRL" and noticed that it is possible to run many environments in parallel in a rollout, but for that it is necessary to pass the batch_size (= number of environments) into the rollout call. Is it possible to define the batch_size on the environment itself? I would appreciate your help!

  • In the tutorial it looks like this:

```python
# Executing a rollout with a batch of data requires us to reset the environment
# out of the rollout function, since we need to define the batch_size
# dynamically and this is not supported by rollout()
rollout = env.rollout(
    47,
    auto_reset=False,  # we're executing the reset out of the rollout call
    tensordict=env.reset(env.gen_params(batch_size=[batch_size])),
)
print("rollout of len 3 (batch size of 10):", rollout)
```

  • and I want something like this (and obtain the same results as above):

```python
env.batch_size = [batch_size]
env.rollout(47)
```
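To make the intent concrete, here is a minimal toy sketch (plain torch only, not the real `PendulumEnv` or TorchRL API; the `batch_size` constructor argument and the dummy dynamics are my own invention) of the pattern I am hoping for, where the batch size is fixed once on the environment and `rollout` then just works on batched states:

```python
import torch


class BatchedEnvSketch:
    """Toy stand-in for an env whose batch size is fixed at construction."""

    def __init__(self, batch_size):
        # The batch size is part of the env itself, not of the rollout call.
        self.batch_size = torch.Size(batch_size)

    def reset(self):
        # Batched initial state: one angle per environment in the batch.
        return {"th": torch.zeros(*self.batch_size)}

    def rollout(self, max_steps):
        # Collect `max_steps` batched observations along a new trailing dim.
        frames = [self.reset()["th"]]
        for _ in range(max_steps - 1):
            frames.append(frames[-1] + 0.1)  # dummy dynamics
        return torch.stack(frames, dim=-1)  # shape: (*batch_size, max_steps)


env = BatchedEnvSketch(batch_size=[7])
out = env.rollout(47)
print(out.shape)  # torch.Size([7, 47])
```

This is only meant to illustrate the desired usage, i.e. obtaining the same batched rollout as in the tutorial without threading `batch_size` through `gen_params` and `reset` by hand.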

  • I get the following error:

```
ValueError                                Traceback (most recent call last)
Cell In[8], line 3
      1 env = PendulumEnv(device="cpu")
      2 check_env_specs(env)
----> 3 env.batch_size = [7]

File ~/miniconda3/envs/vmas/lib/python3.9/site-packages/torchrl/envs/common.py:397, in EnvBase.__setattr__(self, key, value)
    384 if key in (
    385     "_input_spec",
    386     "_observation_spec",
    (...)
    391     "_done_spec",
    392 ):
    393     raise AttributeError(
    394         "To set an environment spec, please use env.observation_spec = obs_spec (without the leading"
    395         " underscore)."
    396     )
--> 397 return super().__setattr__(key, value)

File ~/miniconda3/envs/vmas/lib/python3.9/site-packages/torch/nn/modules/module.py:1747, in Module.__setattr__(self, name, value)
   1745     buffers[name] = value
   1746 else:
--> 1747     super().__setattr__(name, value)

File ~/miniconda3/envs/vmas/lib/python3.9/site-packages/torchrl/envs/common.py:442, in EnvBase.batch_size(self, value)
    437 if (
    438     hasattr(self, "output_spec")
    439     and self.output_spec.shape[: len(value)] != value
    440 ):
    441     self.output_spec.unlock_()
--> 442     self.output_spec.shape = value
    443     self.output_spec.lock_()
    444 if hasattr(self, "input_spec") and self.input_spec.shape[: len(value)] != value:

File ~/miniconda3/envs/vmas/lib/python3.9/site-packages/torchrl/data/tensor_specs.py:605, in TensorSpec.__setattr__(self, key, value)
    603 if key == "shape":
    604     value = torch.Size(value)
--> 605 super().__setattr__(key, value)

File ~/miniconda3/envs/vmas/lib/python3.9/site-packages/torchrl/data/tensor_specs.py:3334, in CompositeSpec.shape(self, value)
   3332 if isinstance(spec, CompositeSpec):
   3333     if spec.shape[: len(value)] != value:
--> 3334         spec.shape = value
   3335 elif spec is not None:
   3336     if spec.shape[: len(value)] != value:

File ~/miniconda3/envs/vmas/lib/python3.9/site-packages/torchrl/data/tensor_specs.py:605, in TensorSpec.__setattr__(self, key, value)
    603 if key == "shape":
    604     value = torch.Size(value)
--> 605 super().__setattr__(key, value)

File ~/miniconda3/envs/vmas/lib/python3.9/site-packages/torchrl/data/tensor_specs.py:3337, in CompositeSpec.shape(self, value)
   3335 elif spec is not None:
   3336     if spec.shape[: len(value)] != value:
--> 3337         raise ValueError(
   3338             f"The shape of the spec and the CompositeSpec mismatch during shape resetting: the "
   3339             f"{self.ndim} first dimensions should match but got self['{key}'].shape={spec.shape} and "
   3340             f"CompositeSpec.shape={self.shape}."
   3341         )
   3342 self._shape = torch.Size(value)

ValueError: The shape of the spec and the CompositeSpec mismatch during shape resetting: the 0 first dimensions should match but got self['th'].shape=torch.Size() and CompositeSpec.shape=torch.Size().
```