Orthogonal vector to a given N-dimensional vector using PyTorch

Greetings,

I am new to PyTorch and want to use it in my undergraduate math project. I am working on a problem where, given a vector u, I need to find a vector v such that the dot product of u and v is zero. In 2 or 3 dimensions this would be easy, but in my case the dimension is an arbitrary value N. How can I do this efficiently using PyTorch?

I have an approach in mind, although I’m not sure it is correct. If I generate N-2 random vectors, I can arrange the resulting N-1 vectors (the N-2 random vectors plus my vector u) as the rows of a matrix, like we do for the vector cross product, and then read off the N components of the normal/orthogonal vector as the alternating-sign determinants of the (N-1)×(N-1) minors (just like the cofactor expansion used for the 3D cross product). I’m not sure whether this cross-product property actually holds in N dimensions. Also, can someone suggest a way to speed up the determinant calculation using PyTorch?
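For concreteness, here is a rough sketch of what I have in mind (the function name generalized_cross is just something I made up, and I haven’t checked whether this is anywhere near efficient):

import torch

def generalized_cross(rows):
    # rows: an (N-1, N) matrix whose rows are the N-1 input vectors.
    # Component i is the determinant of the minor obtained by removing
    # column i, with alternating sign, as in the 3D cross-product cofactor
    # expansion. The result should be orthogonal to every row.
    n = rows.shape[1]
    comps = []
    for i in range(n):
        cols = [j for j in range(n) if j != i]
        comps.append((-1) ** i * torch.linalg.det(rows[:, cols]))
    return torch.stack(comps)

# e.g. u of dimension N = 5 plus N-2 = 3 random vectors
u = torch.randn(5)
rows = torch.cat([u.unsqueeze(0), torch.randn(3, 5)])
v = generalized_cross(rows)
print(torch.dot(u, v))  # should be ~0 if the construction is right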

To summarize, my questions are as follows:

  1. How can I find a normal/orthogonal vector to a given N-dimensional vector? [I’m not sure whether a normal vector and an orthogonal vector are the same thing in N-dimensional space; I just need their dot product to be zero.]
  2. If the approach mentioned above is correct, how can I speed up the determinant calculations using PyTorch?

[Mods please let me know if this is the correct place to discuss this. Apologies if it isn’t]

If you just need any orthogonal vector, you can do something very simple like:

u = v.clone()
u[-1] = -(v[:-1] @ u[:-1]) / v[-1]  # choose the last entry so that u @ v == 0 (needs v[-1] != 0)
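
For example, a self-contained check, assuming v is a 1-D float tensor whose last entry is nonzero (here v is just a random vector for illustration):

import torch

v = torch.randn(8)                  # the given vector (last entry assumed nonzero)
u = v.clone()
u[-1] = -(v[:-1] @ u[:-1]) / v[-1]  # fix the last entry so the dot product cancels
print(torch.dot(u, v))              # ~0 up to floating-point error

If v[-1] happens to be zero (or very close to it), you can pick any index with a nonzero entry instead and solve for that component in the same way.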