How to forward and backward non-tensor data?

Most deep learning frameworks can forward and backward tensor data, or forward objects that can be serialized into a tensor. If I want to forward (and possibly backward) a complicated custom data structure that is written in C/C++ and cannot easily be serialized into flat memory, how can I do that?

In Caffe this is easy, because I can put the custom data into a member variable of a Layer class and then forward the address of the object. Every layer instance is called (sequentially) only once in a forward pass of the graph, so everything is fine. In the backward pass I can access the same object and modify it to accumulate a non-tensor gradient if needed.
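
For illustration, here is a minimal, self-contained sketch of that pattern. It does not use the real Caffe API (the `CustomLayer`, blob types, and method signatures below are simplified stand-ins I made up); it only shows the idea of keeping the non-tensor object alive as a layer member and passing its address downstream as ordinary blob data:

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Some complicated custom structure that cannot be flattened into a tensor.
struct CustomState {
    std::vector<float> history;   // arbitrary per-layer bookkeeping
    double accumulated_grad = 0;  // non-tensor "gradient" we accumulate
};

// Simplified stand-ins for framework blobs (not real Caffe types):
using DataBlob   = std::vector<float>;           // ordinary tensor data
using HandleBlob = std::vector<std::uintptr_t>;  // carries object addresses

class CustomLayer {
 public:
    // Forward: do the real work on the member object, then pass its address
    // downstream as plain numeric blob data.
    void Forward(const DataBlob& bottom, HandleBlob& top) {
        state_.history.assign(bottom.begin(), bottom.end());
        top.assign(1, reinterpret_cast<std::uintptr_t>(&state_));
    }

    // Backward: the same layer instance is used once per pass, so we can
    // touch the same member object and accumulate into it.
    void Backward(const DataBlob& top_diff) {
        for (float d : top_diff) state_.accumulated_grad += d;
    }

    const CustomState& state() const { return state_; }

 private:
    CustomState state_;  // lives as long as the layer, one copy per instance
};

int main() {
    CustomLayer layer;
    DataBlob bottom = {1.0f, 2.0f, 3.0f};
    HandleBlob top;

    layer.Forward(bottom, top);

    // A downstream layer can recover the pointer from the handle blob.
    auto* state = reinterpret_cast<CustomState*>(top[0]);
    std::printf("history size seen downstream: %zu\n", state->history.size());

    layer.Backward({0.5f, 0.25f});
    std::printf("accumulated non-tensor grad: %f\n",
                layer.state().accumulated_grad);
    return 0;
}
```

Note that real Caffe blobs hold floating-point data, so if you actually push an address through a blob you have to make sure it survives the round trip; the part that matters here is that the object lives in the layer and both `Forward` and `Backward` of that same instance see the same pointer.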