Hi, I found that the Numba library helps run code faster than plain Python. To reduce training time, does Numba work with PyTorch? If not, can you suggest an alternative way to speed it up? Thanks
Hi, as I understand it, Numba JIT-compiles Python code to make it faster. In that sense it should work fine alongside PyTorch, but it is unlikely to help much, since PyTorch already wraps its critical operations in C++.
For inference or evaluation during training, make sure you wrap the forward pass in with torch.no_grad():, and avoid any other unnecessary operations during model evaluation.
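A minimal sketch of what that looks like, using a hypothetical toy model for illustration:

```python
import torch
import torch.nn as nn

# Hypothetical toy model, just for illustration.
model = nn.Linear(10, 2)
model.eval()  # put layers like dropout/batch-norm in eval mode

x = torch.randn(4, 10)

# no_grad() disables autograd bookkeeping, saving memory and
# time for forward passes where gradients are not needed.
with torch.no_grad():
    out = model(x)

assert not out.requires_grad
```

Note that model.eval() and torch.no_grad() are complementary: the former changes layer behavior, the latter turns off gradient tracking.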
Daniel Gonzalez
Embedded SW Engineer at RidgeRun
You could also have a look into torch.compile.