@ptrblck I think I want to do something like this, except I am unsure how to add self-attention to intermediate ResNet blocks.
Also, for self-attention there are so many options. Is there any built-in module within PyTorch that you would suggest?
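To make it concrete, here is a rough sketch of what I'm imagining, using torchvision's resnet18 and nn.MultiheadAttention. The SelfAttention2d wrapper and the placement after layer2/layer3 are just my guesses, not something I've verified is the right approach:

```python
import torch
import torch.nn as nn
from torchvision import models


class SelfAttention2d(nn.Module):
    """Wraps nn.MultiheadAttention so it can sit between conv blocks."""

    def __init__(self, channels, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(
            embed_dim=channels, num_heads=num_heads, batch_first=True
        )
        self.norm = nn.LayerNorm(channels)

    def forward(self, x):
        b, c, h, w = x.shape
        # flatten spatial dims into a token sequence: (B, H*W, C)
        seq = x.flatten(2).transpose(1, 2)
        attn_out, _ = self.attn(seq, seq, seq)
        seq = self.norm(seq + attn_out)  # residual connection + norm
        # restore the (B, C, H, W) layout for the next ResNet stage
        return seq.transpose(1, 2).reshape(b, c, h, w)


model = models.resnet18(weights=None)
# my guess: wrap intermediate stages so attention runs after them
# (resnet18: layer2 outputs 128 channels, layer3 outputs 256)
model.layer2 = nn.Sequential(model.layer2, SelfAttention2d(channels=128))
model.layer3 = nn.Sequential(model.layer3, SelfAttention2d(channels=256))

x = torch.randn(2, 3, 224, 224)
print(model(x).shape)  # torch.Size([2, 1000])
```

Is wrapping the stages in nn.Sequential like this a reasonable way to do it, or would you override the model's forward instead?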
I also found these alternatives; what is your take on them?