Tensor multiplication along certain axis

Without going into details, the data of a tensor (whatever its shape) is always stored internally as a flat 1-D array. The mapping from indices to memory offsets, which we call the memory layout, determines how the tensor's shape is interpreted. For instance, when you do b = a.view(some_shape), a and b share the same data to avoid a costly copy, but they may have different memory layouts. In PyTorch, "contiguous" refers to one specific memory layout, so a tensor can be "contiguous" or "non-contiguous".
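As a minimal illustration (using a small toy tensor), you can check the sharing and the layout with data_ptr() and is_contiguous():

```python
import torch

a = torch.arange(6)          # data stored as a flat 1-D buffer: [0, 1, 2, 3, 4, 5]
b = a.view(2, 3)             # same underlying data, new layout (2 rows, 3 cols)

print(a.data_ptr() == b.data_ptr())  # True: a and b share the same storage
print(b.is_contiguous())             # True

t = b.t()                    # transposing only changes the layout (strides), not the data
print(t.is_contiguous())     # False: rows are no longer adjacent in memory
```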
Contiguous tensors are usually more convenient, as a few operations won't work with a non-contiguous input; view is one example of an operation that fails on a non-contiguous input tensor. The operation reshape has the same behavior as view, except that it also works with non-contiguous data, in which case the data is copied.
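A quick sketch of that difference, using a transposed tensor as the non-contiguous example:

```python
import torch

a = torch.arange(6).view(2, 3).t()   # transposing makes a non-contiguous
print(a.is_contiguous())             # False

try:
    a.view(6)                        # view refuses a non-contiguous input
except RuntimeError:
    print("view failed on non-contiguous input")

b = a.reshape(6)                     # reshape works: it silently copies the data
print(a.data_ptr() == b.data_ptr())  # False: b is a copy here
```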

But you should not worry too much about the contiguity of your tensors. Most of the time you won't have any problem, and if an error is raised at some point because an operation expected a contiguous input tensor, you can simply fix it by changing operation(tensor) to operation(tensor.contiguous()).
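For example, with view standing in for the failing operation, the fix looks like this:

```python
import torch

t = torch.arange(6).view(2, 3).t()   # non-contiguous tensor

# t.view(-1) would raise a RuntimeError here, but adding .contiguous()
# first makes a contiguous copy, so the call succeeds:
flat = t.contiguous().view(-1)
print(flat)                          # tensor([0, 3, 1, 4, 2, 5])
```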

So, in short, when you want a reshaped tensor b from a:

  • If a is contiguous and you want a and b to share the same storage, use b = a.view(some_shape). If you then do an in-place operation on a, b will also be modified.
  • If a is contiguous and you don’t want a and b to share the same storage, use b = a.view(some_shape).clone() or b = a.reshape(some_shape).clone().
  • If a is not contiguous, use b = a.reshape(some_shape) or b = a.contiguous().view(some_shape); a and b won’t share the same storage.
  • If a may be contiguous or non-contiguous, use b = a.reshape(some_shape), but you cannot know in advance whether a and b will share the same data or not.
  • If a may be contiguous or non-contiguous and you don’t want a and b to possibly share the same data, use b = a.reshape(some_shape).clone().
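You can verify the sharing behavior in each case yourself; one way (a sketch with a contiguous starting tensor) is to compare data pointers:

```python
import torch

a = torch.arange(12).view(3, 4)      # contiguous

b = a.view(4, 3)                     # shares storage with a
print(a.data_ptr() == b.data_ptr())  # True
a[0, 0] = 99
print(b[0, 0].item())                # 99: in-place change to a is visible in b

c = a.reshape(4, 3).clone()          # guaranteed independent copy
print(a.data_ptr() == c.data_ptr())  # False
```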

For more information on what contiguous means in PyTorch, you can look at this thread. I also invite you to read the doc page about view.

I know the contiguous thing can be confusing; I hope this answer helped.