You could set the seed using torch.manual_seed before running these operations. Note, however, that seeding right before a random operation (such as dropout) makes it sample the same mask for the same input shape, so the operation isn’t “random” anymore.
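A minimal sketch of this effect (the shapes and seed value are arbitrary choices for illustration): reseeding right before the dropout call fixes the PRNG state, so the same mask is sampled every time.

```python
import torch
import torch.nn.functional as F

def seeded_dropout(seed: int) -> torch.Tensor:
    # Reseeding immediately before the call fixes the PRNG state,
    # so dropout samples the identical mask on every invocation.
    torch.manual_seed(seed)
    return F.dropout(torch.ones(2, 4), p=0.5)

mask_a = seeded_dropout(0)
mask_b = seeded_dropout(0)
print(torch.equal(mask_a, mask_b))  # True: the "random" mask repeats
```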
You also cannot easily switch between deterministic and random behavior, since every operation using the pseudorandom number generator will consume numbers from the already seeded stream.
E.g. if you set the seed before operation 1 (the transformer dropout), the same drop mask will be sampled, but all following operations using the PRNG will also produce the same “random” numbers unless you reseed with a new (random) seed.
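A small sketch of this seed-sharing effect (the shapes and the choice of torch.randn as the “following operation” are illustrative assumptions): after seeding, both the dropout mask and every later op drawing from the global PRNG repeat exactly.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
drop_mask = F.dropout(torch.ones(4), p=0.5)  # op 1: dropout
next_sample = torch.randn(4)                 # op 2: uses the same PRNG stream

torch.manual_seed(0)
drop_mask_2 = F.dropout(torch.ones(4), p=0.5)
next_sample_2 = torch.randn(4)

# Both the dropout mask and the following "random" op repeat exactly
assert torch.equal(drop_mask, drop_mask_2)
assert torch.equal(next_sample, next_sample_2)

# To restore non-deterministic behavior afterwards, reseed with a
# fresh random seed; torch.seed() does exactly that.
torch.seed()
```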