Cogent Confabulation neural network

Hi all…I’m new to the forum and have a basic question.

Context:

I’m interested in using PyTorch to develop a new type of neural network. It is based on the theory of Cogent Confabulation by the late Robert Hecht-Nielsen. I recently published a paper in the Neural Networks journal showing improved entity recognition using a measure of ontology cogency that I developed (my PhD dissertation results). I’m not asking anyone to explore the theory of confabulation in order to answer my question; I’m just providing context. But if you’re interested, please see https://en.wikipedia.org/wiki/Confabulation_(neural_networks)

Question related to PyTorch:

I’m new to PyTorch, and don’t quite know where to start, so I’m looking for ideas on PyTorch customization.

Basically, I want to develop a custom recurrent layer whose output is the product of conditional probabilities.

Right now the conditional probabilities are computed from a corpus and an ontology prior to training the neural network. To keep life simple, let’s assume that will not change.

How does one go about developing a custom layer in PyTorch that computes the product of these conditional probabilities?
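
To make the question concrete, here is roughly what I’m imagining: a layer that holds the precomputed conditional-probability matrix as a buffer and, for a batch of assumed-fact indices, returns the product of the corresponding probabilities for every candidate conclusion. The class name, shapes, and the random placeholder probabilities below are just illustrations, not my actual setup; the real probabilities would come from the corpus/ontology step.

```python
import torch
import torch.nn as nn

class ConfabulationLayer(nn.Module):
    """Sketch: scores each candidate conclusion by the product of
    precomputed conditional probabilities p(input symbol | conclusion)."""

    def __init__(self, cond_probs: torch.Tensor):
        super().__init__()
        # cond_probs: (num_input_symbols, num_conclusions), precomputed
        # from the corpus/ontology. A buffer keeps it out of the optimizer.
        self.register_buffer("log_probs", torch.log(cond_probs.clamp_min(1e-12)))

    def forward(self, fact_indices: torch.Tensor) -> torch.Tensor:
        # fact_indices: (batch, num_assumed_facts) integer indices of active symbols.
        # Gather log p(symbol | conclusion) for each assumed fact, then sum the
        # logs over facts (equivalent to multiplying the probabilities).
        gathered = self.log_probs[fact_indices]   # (batch, facts, num_conclusions)
        log_scores = gathered.sum(dim=1)          # (batch, num_conclusions)
        return log_scores.exp()                   # product of conditional probabilities
```

Usage would look something like this (placeholder sizes and random probabilities):

```python
num_inputs, num_conclusions = 1000, 500
probs = torch.rand(num_inputs, num_conclusions)   # stand-in for the real ontology-derived values
layer = ConfabulationLayer(probs)
facts = torch.randint(0, num_inputs, (4, 3))      # batch of 4 examples, 3 assumed facts each
scores = layer(facts)                             # shape (4, 500)
```

Is something along these lines the right approach, and is summing log-probabilities (rather than multiplying directly) the sensible way to keep the product numerically stable? I’d also welcome pointers on how to wrap this in a recurrent structure.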

Thanks in advance for your advice.