Hi, when implementing `class BertAdapter(nn.Module):`, I am getting the following error:
File "/Users/ikram/opt/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py", line 576, in __getattr__
    type(self).__name__, name))
AttributeError: 'BertAdapter' object has no attribute 'module'
Is this due to the fact that I’m trying to run this on my CPU, and I’m not connected to a GPU?
Could you post the class definition of `BertAdapter` which raises this error, please?
The error is being thrown in line 106 of https://github.com/facebookresearch/EmpatheticDialogues/blob/master/retrieval_train.py: `dtype = model.module.opt.dataset_name`
BertAdapter is defined in https://github.com/facebookresearch/EmpatheticDialogues/blob/master/empchat/bert_local.py
class BertAdapter(nn.Module):
    def __init__(self, opt, dictionary):
        from parlai.agents.bert_ranker.helpers import BertWrapper

        try:
            from pytorch_pretrained_bert import BertModel
        except ImportError:
            raise Exception(
                "BERT rankers needs pytorch-pretrained-BERT installed. "
                "\npip install pytorch-pretrained-bert"
            )
        super().__init__()
        self.opt = opt
        self.pad_idx = dictionary[PAD_TOKEN]
        self.ctx_bert = BertWrapper(
            bert_model=BertModel.from_pretrained(BERT_ID),
            output_dim=opt.bert_dim,
            add_transformer_layer=opt.bert_add_transformer_layer,
        )
        self.cand_bert = BertWrapper(
            bert_model=BertModel.from_pretrained(BERT_ID),
            output_dim=opt.bert_dim,
            add_transformer_layer=opt.bert_add_transformer_layer,
        )
        …
Thanks for the line of code!
The `.module` attribute is added when you wrap your model into e.g. `nn.DataParallel`, as seen in this line of code.
However, based on the condition, it seems this line will only be called if you are using the GPU (or rather, if `opt_.cuda` is `True`).
Is this the case, or are you using the CPU?
In the latter case, just remove the `.module` attribute and use `dtype = model.opt.dataset_name` directly.
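To illustrate the difference between the two paths, here is a minimal sketch with a hypothetical stand-in model (the `TinyAdapter` and `Opt` names are made up for the example; they just mimic `BertAdapter` storing its options in `self.opt`):

```python
import torch.nn as nn


class Opt:
    # Stand-in for the options object; only the attribute we need.
    dataset_name = "empchat"


class TinyAdapter(nn.Module):
    # Hypothetical minimal model mimicking BertAdapter's self.opt.
    def __init__(self):
        super().__init__()
        self.opt = Opt()
        self.linear = nn.Linear(4, 4)


model = TinyAdapter()

# CPU path: the model is used directly, so there is no .module —
# accessing model.module here is what raises the AttributeError.
dtype_cpu = model.opt.dataset_name

# GPU path: nn.DataParallel registers the original model under .module,
# which is why the training script uses model.module.opt.dataset_name.
wrapped = nn.DataParallel(model)
dtype_wrapped = wrapped.module.opt.dataset_name

# A device-agnostic way to write the line, working in both cases:
base = model.module if isinstance(model, nn.DataParallel) else model
dtype = base.opt.dataset_name
```

The `isinstance` check at the end avoids having to edit the line depending on whether the script runs on CPU or GPU.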
Thank you so much! Your response was super helpful!