Autograd: Add VJP and JVP rules for aten::aminmax #151186

This adds functionally correct backward (VJP) and forward (JVP) autograd rules for the `aten::aminmax` operator to `derivatives.yaml`, reusing existing helper functions, so that the operator differentiates correctly in eager mode. A sketch of the entry shape follows below.
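For context, here is a minimal sketch of what such an entry could look like for the full-reduction (`dim=None`) case, assuming the existing `evenly_distribute_backward` and `evenly_read_jvp` helpers that already back the plain `min`/`max` derivatives. The actual formulas in this PR, and the handling of the `dim` case, may differ:

```yaml
# Illustrative sketch only -- not the exact entry from this PR.
# evenly_distribute_backward and evenly_read_jvp are the existing
# helpers used by the full-reduction min/max derivatives.
- name: aminmax(Tensor self, *, int? dim=None, bool keepdim=False) -> (Tensor min, Tensor max)
  # VJP: route each output gradient back to the element(s) that
  # produced the min/max, then sum the two contributions.
  self: evenly_distribute_backward(grads[0], self, min) + evenly_distribute_backward(grads[1], self, max)
  # JVP: each output tangent reads the input tangent at the
  # selected element(s).
  min: evenly_read_jvp(self_t, self, min)
  max: evenly_read_jvp(self_t, self, max)
```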

This PR has been open for some time. How should we move forward with it?