Why didn't F.dropout drop anything out?

import torch
import torch.nn as nn
import torch.optim as optim
import torch.nn.functional as F

b = torch.rand(16, 20)
F.dropout(b, p=0.2)

Whatever value I set for p (the probability of an element being zeroed), the result never contains any zeros. What is the reason for this?

It's weird. Can someone tell me why? Thanks.

It defaults to training=False, so at the default setting it is a no-op. Set training=True: http://pytorch.org/docs/master/nn.html#torch.nn.functional.dropout
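
For example, a minimal sketch reusing the b tensor from the question: with training=True, elements actually get zeroed and the surviving ones are scaled by 1 / (1 - p).

import torch
import torch.nn.functional as F

b = torch.rand(16, 20)

# training=True enables dropout: roughly 20% of elements are zeroed,
# and the remaining ones are scaled by 1 / (1 - 0.2).
out = F.dropout(b, p=0.2, training=True)
print((out == 0).sum())  # nonzero count now

Alternatively, use the nn.Dropout module inside a model; it toggles this flag for you via model.train() and model.eval().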
