PyTorch Forums
Fine-tune LLMs using torchtune
pparshakov
April 30, 2024, 12:05pm
I'm getting the same error on an RTX 3090, even though torch.cuda.is_bf16_supported() returns True.