Using `set_default_device()` in the following code makes it raise an exception when creating an optimizer, despite otherwise looking like a no-op:
```python
import torch
from torch import nn

# 1. Ensure MPS is available
assert torch.backends.mps.is_available(), "MPS device not found"

# Commenting out the following line fixes the optimizer creation!
torch.set_default_device("mps")

# 2. Set device
device = torch.device("mps")

# 3. Define a minimal model
model = nn.Linear(10, 1)

# 4. Move model to MPS before creating the optimizer
model = model.to(device)

# 5. Create the Adam optimizer *after* moving the model to the device
optimizer = torch.optim.Adam(model.parameters())
```
This raises `RuntimeError: Placeholder storage has not been allocated on MPS device!`, but everything works fine without the `set_default_device()` call. Is this expected, or is this a bug?
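For reference, a scoped alternative I could try instead of the global default is the `torch.device` context manager. This is only a sketch; I have not verified whether it avoids the error above:

```python
import torch
from torch import nn

# Sketch: scope the default device instead of setting it globally.
# torch.device objects can be used as context managers in PyTorch 2.x.
with torch.device("mps"):
    model = nn.Linear(10, 1)  # parameters are created directly on MPS
    optimizer = torch.optim.Adam(model.parameters())
```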
An answer would be useful, because `set_default_device()` is convenient: it seems to remove the need for many explicit `.to(device)` calls!
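To illustrate the convenience I mean (a minimal sketch, assuming a machine where MPS is available):

```python
import torch
from torch import nn

torch.set_default_device("mps")

x = torch.randn(4, 10)    # created on MPS, no .to(device) needed
layer = nn.Linear(10, 1)  # module parameters are also created on MPS
print(x.device, layer.weight.device)
```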
Some details on the configuration:
- PyTorch 2.7.0 (conda)
- Python 3.12
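In case it helps, the versions above were collected along these lines (a small sketch; adjust the fields to whatever else is useful):

```python
import platform
import sys

import torch

print("PyTorch:", torch.__version__)
print("Python:", sys.version.split()[0])
print("macOS:", platform.mac_ver()[0])
print("MPS available:", torch.backends.mps.is_available())
print("MPS built:", torch.backends.mps.is_built())
```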