What's the difference between batch and batch_size?

Hi, there is something I don't quite understand.
In PyTorch, the input shape of an RNN is (seq_len, batch, input_size) or (batch, seq_len, input_size).
In Keras, the input shape of an RNN is (batch_size, time_steps, input_dim).
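For example (the sizes below are just numbers I made up to illustrate):

```python
import torch
import torch.nn as nn

# made-up sizes: seq_len=5, input_size=10, hidden_size=20, and a batch of 32 sequences
rnn = nn.RNN(input_size=10, hidden_size=20, batch_first=True)
x = torch.randn(32, 5, 10)   # (batch, seq_len, input_size)
out, h = rnn(x)
print(out.shape)             # torch.Size([32, 5, 20])
print(h.shape)               # torch.Size([1, 32, 20]) -> (num_layers, batch, hidden_size)
```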

So, do batch_size and batch mean the same thing? I found some different answers on Stack Overflow, but they made me more confused. Like this:

I think
total data = batch * batch_size,
just like the batch_size argument of DataLoader in PyTorch, but I don't know if I am right.
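For example, with a toy dataset (the 128 samples and batch_size=32 are numbers I made up):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.randn(128, 5, 10))   # 128 samples in total
loader = DataLoader(dataset, batch_size=32)

print(len(dataset))   # 128 -> total data
print(len(loader))    # 4   -> number of batches (128 / 32)
for (x,) in loader:
    print(x.shape)    # torch.Size([32, 5, 10]) -> 32 samples in each batch
    break
```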

batch in the PyTorch LSTM documentation == batch_size in the Keras RNN documentation
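Concretely, here is a rough sketch (all sizes are made up): the size of the batch dimension of the tensor that goes into the LSTM is exactly the batch_size you set on the DataLoader.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# made-up data: 128 sequences, each of length 5 with 10 features
data = torch.randn(128, 5, 10)
loader = DataLoader(TensorDataset(data), batch_size=32)

lstm = nn.LSTM(input_size=10, hidden_size=20, batch_first=True)

x, = next(iter(loader))   # one batch produced by the DataLoader
print(x.shape)            # torch.Size([32, 5, 10]) -> 32 is the "batch" dimension
out, (h, c) = lstm(x)
print(out.shape)          # torch.Size([32, 5, 20])
```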

Thank you for your reply.
I quite agree with you, but if that is true, why isn't it written as batch_size in the PyTorch LSTM docs? The DataLoader is written with batch_size, and judging by what the DataLoader means, batch and batch_size seem to be different things.
In other words,