Why is input used as a variable name in examples? Is it a PyTorch convention?

Most PyTorch examples assign the input tensor to a variable named input, which shadows the built-in Python function of the same name.

A discussion is needed about adopting common Python naming conventions in PyTorch.

Are you going to be using the input builtin in the middle of your model? I doubt it. It’s a stupid name for a language to reserve, and I don’t think it’s ever going to be used in this context by anyone, so I’m 100% ok with overriding it in that case.

I know that many people will consider that a “bad practice”, but I seriously don’t think that giving up readability for the sake of satisfying a rule that doesn’t make a lot of sense in this context is worth it.


Regardless of discussions about the meaningfulness of reserving the symbol “input”, I always get a bad feeling whenever I see “input” shadowed in a Python program. Quite often this results in misleading error messages when, for instance, an undefined-variable error should have been raised.
Also, built-in linters in text editors tend not to highlight “input” as undefined or as unused.
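A minimal sketch (not from the thread) of the kind of misleading error meant here: once input is shadowed by a non-callable value, an accidental call to it raises a TypeError rather than the NameError you would get for a genuinely undefined name.

```python
def forward(input):
    # The parameter shadows the builtin `input`. A typo that calls it,
    # e.g. input(0) instead of input[0], no longer reads from stdin or
    # fails as an undefined name; it fails as a non-callable object.
    return input(0)

try:
    forward([1, 2, 3])
except TypeError as e:
    msg = str(e)

print(msg)  # e.g. "'list' object is not callable"
```

A linter configured to warn on redefined builtins would flag the parameter name, but many editor defaults do not.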

Well, no one said it is a convention. We don’t encourage any particular naming convention when writing your own models, so feel free to use any other name if you find it better/safer. I think it mostly doesn’t matter, since the model logic tends to be quite simple, and any usage of input will likely immediately raise a TypeError. But as I said, we don’t plan to establish any conventions for naming arguments.

Sure, I was just suggesting that not overwriting reserved symbols will bring an increase in the quality of the examples (however small that increase is).

If someone sends a PR I’ll accept it, but it’s not a priority for us now, so we’ll do it later otherwise.