Implementing a custom layer: http://pytorch.org/tutorials/beginner/pytorch_with_examples.html#pytorch-custom-nn-modules
This is a very good reference and in fact it was present in one of the links you mentioned.
You basically need to write the implementation of your custom layer in the forward() method of an nn.Module subclass. As long as you use differentiable tensor operations there, autograd computes the gradients for you, so you do not need to write a backward pass; the weight update itself is then applied by the optimizer.
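As a minimal sketch (the `ScaleShift` layer and its parameter names here are made up for illustration, not from the tutorial), a custom layer only needs `__init__` and `forward()`:

```python
import torch
import torch.nn as nn

# Hypothetical custom layer: a learnable per-feature scale and shift.
# Only forward() is written; autograd derives the backward pass from
# the tensor operations used inside it.
class ScaleShift(nn.Module):
    def __init__(self, num_features):
        super().__init__()
        # Registering tensors as nn.Parameter makes them learnable
        # and visible to optimizers via model.parameters().
        self.weight = nn.Parameter(torch.ones(num_features))
        self.bias = nn.Parameter(torch.zeros(num_features))

    def forward(self, x):
        # Any differentiable tensor operations are allowed here.
        return x * self.weight + self.bias

layer = ScaleShift(4)
x = torch.randn(8, 4)
out = layer(x)
loss = out.sum()
loss.backward()  # autograd fills in layer.weight.grad and layer.bias.grad
```

After `backward()`, an optimizer step (e.g. `torch.optim.SGD(layer.parameters(), lr=0.1).step()`) would perform the actual weight update.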
I hope this helps.