Is there a sample best-practice project showing PyTorch at (close to) 100% GPU utilization during training?
I ask because my own project* sits at 1% GPU utilization during training, and despite profiling I can't figure out why. Having a reference project to compare against would help.
Kind regards,
Gili
* My project uses a combination of GRU and Linear layers to predict a time series. GPU usage stays low even when I hard-code the inputs (to rule out CPU-side input processing as the bottleneck).
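For comparison, here is a minimal sketch of the kind of GRU + Linear setup described above, with the model and inputs placed on the GPU. The layer sizes, batch size, and sequence length are illustrative assumptions, not the actual architecture; the point is that small models or tiny batches often leave the GPU mostly idle because per-step kernel-launch overhead dominates, so a larger batch is one easy thing to try.

```python
import torch
import torch.nn as nn

# Sketch of a GRU + Linear time-series model (illustrative sizes).
class SeqModel(nn.Module):
    def __init__(self, n_features=8, hidden=256, horizon=1):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, horizon)

    def forward(self, x):
        out, _ = self.gru(x)          # out: (batch, seq_len, hidden)
        return self.head(out[:, -1])  # predict from the last time step

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = SeqModel().to(device)

# The model AND every input batch must live on the same device;
# a tensor left on the CPU forces a host-to-device copy each step.
x = torch.randn(512, 100, 8, device=device)  # large batch keeps the GPU busy
y = model(x)
print(tuple(y.shape))  # (512, 1)
```

With a hidden size this small, utilization may still be modest on a fast GPU; watching `nvidia-smi` while scaling the batch or hidden size up shows whether the GPU is actually the limiting resource.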