Using 6 labels instead of 2 for sequence classification

Published 2024-09-29 06:33:34


I was wondering whether it is possible to extend the Hugging Face BertForSequenceClassification model to more than 2 labels. The documentation says we can pass positional arguments, but passing "labels" doesn't seem to work. Does anyone have an idea?

Model assignment

import torch as th
import transformers

labels = th.tensor([0, 0, 0, 0, 0, 0], dtype=th.long).unsqueeze(0)
print(labels.shape)
modelBERTClass = transformers.BertForSequenceClassification.from_pretrained(
    'bert-base-uncased',
    labels=labels
    )

l = [module for module in modelBERTClass.modules()]
l

Console output

torch.Size([1, 6])
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-122-fea9a36402a6> in <module>()
      3 modelBERTClass = transformers.BertForSequenceClassification.from_pretrained(
      4     'bert-base-uncased',
----> 5     labels=labels
      6     )
      7 

/usr/local/lib/python3.6/dist-packages/transformers/modeling_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    653 
    654         # Instantiate model.
--> 655         model = cls(config, *model_args, **model_kwargs)
    656 
    657         if state_dict is None and not from_tf:

TypeError: __init__() got an unexpected keyword argument 'labels'

1 Answer

Answered 2024-09-29 06:33:34

When you load a model with .from_pretrained, it uses the default values from that model's configuration. In the case of bert-base-uncased, the model supports only two different labels because of the value of config.num_labels:

from transformers import BertForSequenceClassification
model = BertForSequenceClassification.from_pretrained('bert-base-uncased')
model.parameters

Output:

...
  (classifier): Linear(in_features=768, out_features=2, bias=True)

You can easily change this value by modifying the BertConfig:

from transformers import BertForSequenceClassification, BertConfig

config = BertConfig.from_pretrained('bert-base-uncased')
config.num_labels = 6
model = BertForSequenceClassification(config) 
model.parameters

Output:

...
 (classifier): Linear(in_features=768, out_features=6, bias=True)
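One caveat to the config-only approach above: constructing the model with BertForSequenceClassification(config) initializes all weights randomly rather than loading the pretrained ones. A shorter alternative, as a sketch, is to pass num_labels directly to from_pretrained, which keeps the pretrained encoder weights and only sizes the (freshly initialized) classification head:

```python
from transformers import BertForSequenceClassification

# num_labels is forwarded into the config, so the classifier
# head is built with 6 outputs while the BERT encoder itself
# still loads its pretrained weights.
model = BertForSequenceClassification.from_pretrained(
    'bert-base-uncased', num_labels=6)

print(model.classifier.out_features)  # 6
```

Note also that labels is an argument of the model's forward pass (used to compute the loss), not of from_pretrained, which is why the TypeError in the question occurs.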
