Is the index_add_ function differentiable?

    import torch

    # accumulator for per-vertex normals, same shape as the vertex tensor
    verts_normals = torch.zeros_like(cornea_vertex)
    # gather the three vertex positions of every face: (F, 3, 3)
    vertices_faces = cornea_vertex[face_index]

    # per-face normal from the cross product of two edge vectors
    faces_normals = torch.cross(
        vertices_faces[:, 2] - vertices_faces[:, 1],
        vertices_faces[:, 0] - vertices_faces[:, 1],
        dim=-1,
    )
    # safe_normalize is a user-defined helper that normalizes vectors
    # while guarding against zero-length inputs
    unit_faces_normals = safe_normalize(faces_normals)
    # scatter-add each face's unit normal onto its three vertices
    verts_normals.index_add_(0, face_index[:, 0], unit_faces_normals)
    verts_normals.index_add_(0, face_index[:, 1], unit_faces_normals)
    verts_normals.index_add_(0, face_index[:, 2], unit_faces_normals)

The function outputs verts_normals, and unit_faces_normals is differentiable. Will the index_add_ calls above prevent gradients from propagating back through the graph?

Yes, index_add_ is differentiable, so it will not block backpropagation. The gradient with respect to the source tensor is just the incoming gradient gathered at the given indices (an index_select), so gradients flow back through unit_faces_normals and on to cornea_vertex.
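
A quick way to convince yourself is to run a small backward pass (or torch.autograd.gradcheck) through index_add. A minimal sketch with made-up shapes and names standing in for your tensors:

    import torch

    # toy stand-ins for unit_faces_normals (requires grad) and face indices
    src = torch.randn(4, 3, dtype=torch.double, requires_grad=True)
    index = torch.tensor([0, 2, 2, 1])

    out = torch.zeros(3, 3, dtype=torch.double)
    # out-of-place variant; the in-place index_add_ behaves the same here
    out = out.index_add(0, index, src)

    out.sum().backward()
    print(src.grad)  # all ones: gradients reached the source tensor

    # gradcheck verifies the analytic gradient against finite differences
    def accumulate(s):
        return torch.zeros(3, 3, dtype=torch.double).index_add(0, index, s)

    print(torch.autograd.gradcheck(accumulate, (src,)))  # True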