I just watched an MIT discussion on liquid neurons and Liquid Neural Networks. They took a fully connected layer of 10,000 neurons and replaced it with 19 liquid neurons connected to two more layers, each with fewer liquid neurons. The math for the liquid neurons is defined in their work. Would this be implemented as a new nn module class, e.g. a LiquidNeuralNetwork, alongside existing types like Linear? Would it be fairly easy to subclass the existing classes and substitute the formulas? A rough sketch of what I mean is below.
Here is the video presentation on it: Liquid Neural Networks, A New Idea That Allows AI To Learn Even After Training - YouTube
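To make the question concrete, here is roughly the structure I have in mind: a small stack of liquid cells standing in for one wide FC layer. The `LiquidCell.forward` below is just a placeholder leaky update, not the actual liquid-neuron formula from the talk, and the 19 → 8 → 4 layer sizes are only illustrative (the talk only says the follow-on layers are smaller).

```python
import torch
import torch.nn as nn

class LiquidCell(nn.Module):
    """Placeholder liquid layer; a real implementation would put the
    liquid time-constant update in forward() instead of this stub."""
    def __init__(self, input_size, units):
        super().__init__()
        self.units = units
        self.lin = nn.Linear(input_size + units, units)

    def forward(self, x, h=None):
        if h is None:
            h = x.new_zeros(x.size(0), self.units)
        # Stub: a leaky recurrent update standing in for the liquid ODE step.
        return 0.9 * h + 0.1 * torch.tanh(self.lin(torch.cat([x, h], dim=-1)))

class LiquidHead(nn.Module):
    """Replaces one wide fully connected layer with a small stack of
    liquid cells (layer sizes here are made up for illustration)."""
    def __init__(self, input_size):
        super().__init__()
        self.l1 = LiquidCell(input_size, 19)
        self.l2 = LiquidCell(19, 8)
        self.l3 = LiquidCell(8, 4)

    def forward(self, x):
        return self.l3(self.l2(self.l1(x)))

head = LiquidHead(input_size=64)
print(head(torch.randn(2, 64)).shape)  # torch.Size([2, 4])
```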
Is anyone working on implementing this in PyTorch today?
You would just need to translate this LTCCell definition from TensorFlow to PyTorch.
ChatGPT or another code LLM could probably handle most of the heavy lifting for you, leaving only some minor debugging.
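For reference, here is a rough sketch of what such a translation might look like in PyTorch, assuming the fused explicit-Euler form of the liquid time-constant (LTC) update from the published LTC paper. The parameter names, the single-linear "backbone", and the fixed step count are my simplifications, not the reference TensorFlow LTCCell:

```python
import torch
import torch.nn as nn

class LTCCell(nn.Module):
    """Sketch of a liquid time-constant (LTC) cell.

    Fused explicit-Euler update (per sub-step):
        x <- (x + dt * f(x, I) * A) / (1 + dt * (1/tau + f(x, I)))
    where f is a sigmoid of a learned affine map of [input, state],
    tau are learned time constants, and A is a learned bias vector.
    """
    def __init__(self, input_size, hidden_size, ode_steps=6):
        super().__init__()
        self.hidden_size = hidden_size
        self.ode_steps = ode_steps  # Euler sub-steps per forward call
        self.backbone = nn.Linear(input_size + hidden_size, hidden_size)
        self.tau = nn.Parameter(torch.ones(hidden_size))     # time constants
        self.A = nn.Parameter(torch.randn(hidden_size) * 0.1)  # bias vector

    def forward(self, inp, state=None, dt=0.1):
        if state is None:
            state = inp.new_zeros(inp.size(0), self.hidden_size)
        for _ in range(self.ode_steps):
            f = torch.sigmoid(self.backbone(torch.cat([inp, state], dim=-1)))
            state = (state + dt * f * self.A) / (
                1.0 + dt * (1.0 / torch.abs(self.tau) + f))
        return state

# Unroll the cell over a toy sequence.
cell = LTCCell(input_size=32, hidden_size=19)
x = torch.randn(8, 10, 32)      # (batch, time, features)
h = None
for t in range(x.size(1)):
    h = cell(x[:, t], h)
print(h.shape)                   # torch.Size([8, 19])
```

Backprop through the unrolled Euler steps works like any other recurrent cell, so it can be trained with the usual PyTorch optimizers.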
It seems a PyTorch implementation of liquid NN layers already exists: