Pipelines loaded with torch_dtype=torch.float16 cannot run with cpu device. It is not recommended to move them to cpu as running them will fail. Please make sure to use an accelerator to run the pipeline in inference, due to the lack of support for float16 operations on this device in PyTorch. Please, remove the torch_dtype=torch.float16 argument, or use another device for inference.

Did you try removing torch_dtype=torch.float16 in your model creation as explained in the error message?
Where would I find this file so I can remove it? I'm having the same problem.
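It isn't a library file you need to edit: the torch_dtype=torch.float16 argument is passed in your own script, at the point where you create the pipeline. A minimal sketch of the change, assuming a diffusers Stable Diffusion pipeline (the model id, script, and prompt here are placeholders, not from the original post):

import torch
from diffusers import StableDiffusionPipeline

# Placeholder checkpoint; use whichever model you actually load.
model_id = "runwayml/stable-diffusion-v1-5"

if torch.cuda.is_available():
    # On a GPU, float16 is fine and saves memory.
    pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
    pipe = pipe.to("cuda")
else:
    # On CPU, drop torch_dtype=torch.float16 so the pipeline stays in float32,
    # which avoids the "cannot run with cpu device" error.
    pipe = StableDiffusionPipeline.from_pretrained(model_id)
    pipe = pipe.to("cpu")

image = pipe("a photo of an astronaut riding a horse").images[0]
image.save("out.png")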