Loss and Gradients Fluctuating very rapidly

I am using a 3-layer neural net with a single input and a single output. I call loss.backward() and then update the weights, but my loss fluctuates very rapidly and never converges. However, I am fairly sure that my loss function has a global minimum.

It is very difficult to help you based on the abstract description in your post. Countless code scenarios could produce the behaviour you describe, and with so little detail we have no way of knowing which of them applies to you.

You will have a far better chance of getting help with your problem if you provide the following, more concrete, information:

  • Complete, executable code that exhibits the behaviour you want to address
    • Ideally, this should include just enough code to demonstrate the points you want us to look at, while remaining executable. The fewer the lines of code involved, the easier it usually is for others to read, understand, and debug.
    • If possible, try to come up with code that takes random tensors of the proper shapes as input, instead of actual data (such as images or text from the real world).
  • A description of what you expected from this code, and what you actually observed.
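To make this concrete, here is a minimal sketch of what such a self-contained example could look like. Since you mentioned loss.backward(), I am assuming PyTorch; the layer widths, learning rate, batch size, and choice of MSE loss are all placeholder assumptions, not a reconstruction of your actual code:

```python
# Minimal, self-contained repro sketch (assumptions: PyTorch, a 3-layer MLP
# with one input and one output, SGD, MSE loss; all hyperparameters are
# placeholders chosen for illustration only).
import torch
import torch.nn as nn

torch.manual_seed(0)  # make the run reproducible for others

model = nn.Sequential(
    nn.Linear(1, 16),
    nn.ReLU(),
    nn.Linear(16, 16),
    nn.ReLU(),
    nn.Linear(16, 1),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

# Random tensors of the proper shapes instead of real-world data
x = torch.randn(64, 1)
y = torch.randn(64, 1)

for step in range(100):
    optimizer.zero_grad()          # clear gradients from the previous step
    loss = loss_fn(model(x), y)
    loss.backward()                # compute gradients
    optimizer.step()               # update the weights
    if step % 20 == 0:
        print(step, loss.item())
```

A stripped-down example like this lets others run your exact training loop and see the fluctuation for themselves, instead of guessing at it from a description.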