`loss.backward()` failure due to a contiguity issue

If the error comes from there, does running just the previous layer together with this one, l10, trigger the same error? :slight_smile:
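A minimal way to isolate the two layers would look something like this. Note that `prev_layer` and `l10` are placeholder `nn.Linear` modules with made-up shapes here, since the actual model isn't shown in this thread; the idea is just to feed a dummy input through only these two layers and call `backward()` to see whether the contiguity error reproduces:

```python
import torch
import torch.nn as nn

# Placeholder stand-ins for the previous layer and l10 from the model;
# replace them with the real modules and matching input shapes.
prev_layer = nn.Linear(16, 32)
l10 = nn.Linear(32, 8)

x = torch.randn(4, 16)          # dummy input with a matching feature size
out = l10(prev_layer(x))        # forward through only these two layers
loss = out.sum()                # any scalar loss works for the test
loss.backward()                 # if the error reproduces here, it's in these layers

print(l10.weight.grad is not None)
```

If the error reproduces, calling `.contiguous()` on the tensor passed between the layers is a common workaround while tracking down which op produced the non-contiguous tensor.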