Interesting use of xtensor with libtorch

I have bumped into this repo:

It uses xtensor for the dataflow from numpy to a C++ tensor to a libtorch tensor:

    // transformer
    xt::xarray<int64_t> kinematicTree;
    xt::from_json(m__model["kinematic_tree"], kinematicTree);
    m__kinematicTree = torch::from_blob(kinematicTree.data(),
        {2, JOINT_NUM}, torch::kInt64).to(m__device); // (2, 24)

This could be very helpful, but I wonder why he would need xtensor when he is already using libtorch. Nearly all NumPy functions are either directly implemented in libtorch or can be implemented easily with its existing functionality.
I faced some conflicts myself a couple of months ago when I tried to use both of them in one project; ultimately I gave up on xtensor, went full libtorch, and I haven't looked back.

What was the reason for not using xtensor?