Am I permitted to use functions such as `torch._foreach_add`?

The function `torch._foreach_add` is not mentioned anywhere in the documentation. A few foreach functions are listed in the Foreach section of the torch — PyTorch 2.3 documentation page, but most of them, like `_foreach_add`, are not listed there either, so I am unsure whether I am allowed to use them or whether they are subject to change. I do notice that the official optimizer implementations use them, and as I understand it, they are a bit faster than doing for loops. So is it okay if I use them?
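For context, here is a minimal sketch of what I mean (the list sizes and learning rate are made up for illustration); the in-place foreach call replaces a per-tensor loop:

```python
import torch

params = [torch.zeros(3) for _ in range(4)]
grads = [torch.ones(3) for _ in range(4)]
lr = 0.1

# Equivalent per-tensor loop: one kernel launch per tensor.
expected = [p - lr * g for p, g in zip(params, grads)]

# The foreach variant updates the whole list of tensors in one call;
# this is the pattern the built-in optimizers use internally.
torch._foreach_add_(params, grads, alpha=-lr)

assert all(torch.allclose(p, e) for p, e in zip(params, expected))
```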

These methods are tagged as internal/private, as indicated by the leading underscore in the function names, since their interface can change and break easily.
You can of course use them, but you should not depend on backwards compatibility and might need to change some calls in future versions if their interfaces change.
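If you do use them, one defensive pattern (just a sketch of my own, not an official recommendation, and `add_lists_` is a made-up helper name) is to guard the call and fall back to a plain loop, so the code still runs if the private function is removed in a later release:

```python
import torch

def add_lists_(tensors, others, alpha=1.0):
    # Use the private fused path when it exists; otherwise fall back to a
    # plain per-tensor loop so the code keeps working if the function is
    # removed in a future PyTorch release.
    if hasattr(torch, "_foreach_add_"):
        torch._foreach_add_(tensors, others, alpha=alpha)
    else:
        for t, o in zip(tensors, others):
            t.add_(o, alpha=alpha)
```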