I am trying to use the PyTorch-based library "transformers".
When I set the device to "mps", I get the error in the title:
Traceback (most recent call last):
  File "/Users/raam/code/pytorch_accl/t2v-transformers-models/./app.py", line 50, in read_item
    vector = await vec.vectorize(item.text, item.config)
  File "/Users/raam/code/pytorch_accl/t2v-transformers-models/./vectorizer.py", line 71, in vectorize
    batch_results = self.get_batch_results(tokens, sentences[start_index:end_index])
  File "/Users/raam/code/pytorch_accl/t2v-transformers-models/./vectorizer.py", line 52, in get_batch_results
    return self.model_delegate.get_batch_results(tokens, text)
  File "/Users/raam/code/pytorch_accl/t2v-transformers-models/./vectorizer.py", line 95, in get_batch_results
    return self.model(**tokens)
  File "/Users/raam/code/pytorch_accl/.venv/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/Users/raam/code/pytorch_accl/.venv/lib/python3.10/site-packages/transformers/models/bert/modeling_bert.py", line 1010, in forward
    embedding_output = self.embeddings(
  File "/Users/raam/code/pytorch_accl/.venv/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/Users/raam/code/pytorch_accl/.venv/lib/python3.10/site-packages/transformers/models/bert/modeling_bert.py", line 235, in forward
    inputs_embeds = self.word_embeddings(input_ids)
  File "/Users/raam/code/pytorch_accl/.venv/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/Users/raam/code/pytorch_accl/.venv/lib/python3.10/site-packages/torch/nn/modules/sparse.py", line 158, in forward
    return F.embedding(
  File "/Users/raam/code/pytorch_accl/.venv/lib/python3.10/site-packages/torch/nn/functional.py", line 2148, in embedding
    return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
RuntimeError: Placeholder storage has not been allocated on MPS device!
Looking at the PyTorch source, this check seems to be the line that always fails when running on MPS:
[github permalink](https://github.com/pytorch/pytorch/blob/e011a8e18bf469a6a612fd1e7647159c353730a9/aten/src/ATen/native/mps/OperationUtils.mm#L331)
TORCH_CHECK(self.is_mps(), "Placeholder storage has not been allocated on MPS device!");
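My suspicion, from the traceback ending in torch.embedding, is that the model weights were moved to "mps" while the tokenized input tensors were still on CPU. A minimal sketch in plain torch (an illustrative embedding layer, not my actual vectorizer code) that exercises the same call path; moving the inputs to the model's device avoids the error:

```python
import torch
import torch.nn as nn

# Use mps when available, otherwise fall back to cpu.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

# Same op that fails in the traceback: nn.Embedding -> F.embedding.
emb = nn.Embedding(100, 8).to(device)

# Tensors are created on cpu by default; on mps, calling
# emb(input_ids) directly raises "Placeholder storage has not
# been allocated on MPS device!" because weight and input live
# on different devices.
input_ids = torch.tensor([[1, 2, 3]])

# Move the inputs to the same device as the model before the call.
out = emb(input_ids.to(device))
print(out.shape)  # torch.Size([1, 3, 8])
```

In my real code that would presumably mean moving every tensor in the tokenizer output (input_ids, attention_mask, etc.) to the device before calling self.model(**tokens).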
Any advice appreciated,
Thanks!