Why do PyTorch Transformer tutorials typically not have a Decoder?

I’m trying to find a PyTorch Transformer tutorial that doesn’t use embeddings, since I’m working with time-series data rather than tokens. But every tutorial I find (e.g. https://github.com/kuberlab-catalog/pytorch-tutorials-code/blob/9e5d67daaa819a24f50bea5f44e1afb490c71092/beginner_source/transformer_tutorial.py) only uses the Encoder and then finishes with a Linear layer. There’s no Decoder at all. Why is that?
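For reference, this is roughly the shape of the model in that tutorial (a simplified sketch from memory; the variable names are mine and I’ve left out the positional encoding and input scaling):

```python
import torch.nn as nn

class TutorialStyleTransformer(nn.Module):
    """Sketch of the pattern I keep seeing: embedding -> encoder stack -> Linear head."""
    def __init__(self, ntoken, d_model=200, nhead=2, d_hid=200, nlayers=2, dropout=0.2):
        super().__init__()
        # Word embeddings; this is the part I'd like to avoid for time-series input.
        self.embedding = nn.Embedding(ntoken, d_model)
        encoder_layer = nn.TransformerEncoderLayer(d_model, nhead, d_hid, dropout)
        self.encoder = nn.TransformerEncoder(encoder_layer, nlayers)
        # The tutorial calls this the "decoder", but it's just a Linear projection,
        # not an nn.TransformerDecoder.
        self.head = nn.Linear(d_model, ntoken)

    def forward(self, src, src_mask=None):
        x = self.embedding(src)
        x = self.encoder(x, src_mask)
        return self.head(x)  # no decoder stack anywhere in the forward pass
```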