How to reduce memory usage for large matrix calculations

A_ = torch.sigmoid(torch.matmul(x, x.t()))
x holds the features of 700,000 nodes, so its shape is [700,000, 8], where 8 is the number of features extracted per node. The calculation requires several TB of memory. How can I reduce the memory overhead?


One option would be to use a Python generator. I'll leave a tutorial below in case it's useful for your application.

Optimizing Memory Usage in Python

I don’t understand how a generator would help here.

@bowensuuu your output matrix will have a shape of [700,000, 700,000] and will thus use ~1.8 TB of memory in float32:

x = torch.randn(7*8, 8)
A_ = torch.sigmoid(torch.matmul(x, x.t()))
# torch.Size([56, 56])

(700000 * 700000 * 4) / 1024**4
# ~1.78 TB

(my example uses 7*8 = 56 rows just to illustrate the shapes)
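As a point of comparison, if only one row block of the matrix is materialized at a time, the peak footprint becomes tractable. A back-of-the-envelope check, assuming a hypothetical block of 1024 rows:

```python
# Memory for a single [1024, 700,000] float32 row block of the matrix.
block_bytes = 1024 * 700000 * 4
print(f"{block_bytes / 1024**3:.2f} GB")  # ~2.67 GB per block
```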

You would need to reduce the number of nodes significantly, or avoid materializing the full matrix at once (e.g., by computing it in chunks and consuming each chunk immediately).
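If the downstream computation can consume the matrix in pieces, the generator idea suggested earlier in the thread can help: yield one row block of the similarity matrix at a time, so the full [700,000, 700,000] result never exists in memory. A minimal sketch (the function name and block size are my own choices):

```python
import torch

def sigmoid_sim_blocks(x, block_size=1024):
    """Yield [block, N] slices of sigmoid(x @ x.T) one block at a time."""
    for start in range(0, x.shape[0], block_size):
        # [block, F] @ [F, N] -> [block, N]: only this slice is in memory.
        yield torch.sigmoid(x[start:start + block_size] @ x.t())

# Example: accumulate a per-row reduction without storing the full matrix.
x = torch.randn(5000, 8)
row_sums = torch.cat([blk.sum(dim=1) for blk in sigmoid_sim_blocks(x)])
```

This only helps if you never need the whole matrix at once; if an algorithm truly requires all of A_ simultaneously, chunking just moves the problem.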