Add_scalars tensorboard

I was wondering why add_scalars warns about exploding memory. What's the point of keeping the scalars in RAM? Are there plans to make this optional?

    def add_scalars(self, main_tag, tag_scalar_dict, global_step=None, walltime=None):
        """Adds many scalar data to summary.

        Note that this function also keeps logged scalars in memory. In extreme case it explodes your RAM.

EDIT:
Is it possible to use add_scalars to pass values at different times?
Rewriting the question:
The provided example sends a single dictionary with 3 values. Is it possible to send 3 dictionaries with different subtags at the same timestep and have them plotted properly?
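To make the scenario concrete, here is a minimal pure-Python sketch of what I mean. `FakeScalarLogger` is a hypothetical stand-in (not the real `SummaryWriter`) that just records each `main_tag/sub_tag` curve the way TensorBoard groups them; the question is whether the real `add_scalars` behaves like this when called three times at the same step:

```python
from collections import defaultdict

class FakeScalarLogger:
    """Hypothetical stand-in for SummaryWriter.add_scalars: each
    sub_tag becomes its own curve keyed by 'main_tag/sub_tag'."""

    def __init__(self):
        # tag -> list of (global_step, value) points
        self.curves = defaultdict(list)

    def add_scalars(self, main_tag, tag_scalar_dict, global_step=None):
        for sub_tag, value in tag_scalar_dict.items():
            self.curves[f"{main_tag}/{sub_tag}"].append((global_step, value))


logger = FakeScalarLogger()

# Three separate calls at the SAME step, each with a different subtag,
# instead of one call with a single 3-entry dictionary:
logger.add_scalars("loss", {"train": 0.9}, global_step=0)
logger.add_scalars("loss", {"val": 1.1}, global_step=0)
logger.add_scalars("loss", {"test": 1.0}, global_step=0)

print(sorted(logger.curves))
# → ['loss/test', 'loss/train', 'loss/val']
```

If the real implementation works this way, the three calls should land as three points on three distinct curves at step 0, equivalent to the single-dictionary call in the docs.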