Hello All,
To avoid overlapping predictions in multilabel text classification, I need to implement the One-vs-All technique in a fine-tuned BERT model.
This technique exists in scikit-learn (here I am using the One-vs-One variant) as follows:
from sklearn.multiclass import OneVsOneClassifier
from sklearn.svm import LinearSVC
from transformers import BertPreTrainedModel

class UNERLinearModel(BertPreTrainedModel):
    def __init__(self, config):
        super().__init__(config)
        ...
        self.entity_classifier = OneVsOneClassifier(LinearSVC(random_state=0))
        ...

    def _forward_train(self, entity_repr, entity_types):
        ...
        # sklearn works on CPU numpy arrays, not torch tensors
        entity_clf = self.entity_classifier.fit(
            entity_repr.detach().cpu().numpy(),
            entity_types.detach().cpu().numpy(),
        ).predict(entity_repr.detach().cpu().numpy())
        ...
In this case, every time fit is called it throws away everything learned so far and retrains from scratch on just the current batch, so I cannot use it inside my model's training loop.
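To illustrate the problem (with random dummy data, nothing from my actual model):

import numpy as np
from sklearn.multiclass import OneVsOneClassifier
from sklearn.svm import LinearSVC

clf = OneVsOneClassifier(LinearSVC(random_state=0))
X1, y1 = np.random.randn(8, 4), np.array([0, 1] * 4)
X2, y2 = np.random.randn(8, 4), np.array([0, 1] * 4)

clf.fit(X1, y1)  # trains on the first batch
clf.fit(X2, y2)  # retrains from scratch; everything learned from X1/y1 is lost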
I haven't found anything similar in PyTorch. Any suggestions on how to implement One-vs-One (or One-vs-All) there?
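For reference, here is the direction I was considering: since One-vs-All amounts to one independent binary classifier per class, maybe a single nn.Linear head trained with BCEWithLogitsLoss (an independent sigmoid per class) already expresses it. This is only a sketch; OneVsAllHead, hidden_size=768, and num_entity_types=5 are placeholders, not from my real model. Would something like this be the right approach?

import torch
import torch.nn as nn

class OneVsAllHead(nn.Module):
    """One independent binary classifier per entity type (One-vs-All)."""
    def __init__(self, hidden_size, num_entity_types):
        super().__init__()
        # one logit per class; each column is its own binary decision
        self.classifier = nn.Linear(hidden_size, num_entity_types)

    def forward(self, entity_repr):
        # raw logits, shape (batch, num_entity_types)
        return self.classifier(entity_repr)

# BCEWithLogitsLoss applies an independent sigmoid to each class logit,
# which is the One-vs-All / multilabel formulation
criterion = nn.BCEWithLogitsLoss()

head = OneVsAllHead(hidden_size=768, num_entity_types=5)
entity_repr = torch.randn(16, 768)                   # stand-in for BERT entity representations
entity_types = torch.randint(0, 2, (16, 5)).float()  # multi-hot labels
loss = criterion(head(entity_repr), entity_types)
loss.backward()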
Thanks in advance