The vanishing gradient problem — alternatives to ResNet?

Backpropagating gradients through many layers is costly, and in very deep networks the gradients can shrink toward zero as they flow backward (the vanishing gradient problem). ResNet's skip connections mitigate this to some extent. Is there an alternative to ResNet for preventing gradients from vanishing?
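For clarity, here is a minimal sketch of the kind of skip connection I mean (written in PyTorch; the class name, layer sizes, and structure are my own illustrative choices, not the exact ResNet architecture):

```python
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    """Minimal residual block: output = relu(F(x) + x).

    The identity shortcut "+ x" gives the gradient a direct path
    backward through the addition, which is what counteracts vanishing.
    """

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x):
        out = F.relu(self.conv1(x))
        out = self.conv2(out)
        # Skip connection: gradients flow through the identity term unchanged.
        return F.relu(out + x)
```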