How to use another pretrained BERT model with the ktrain text classifier?

Posted 2024-09-30 22:12:53


How can we use a different pretrained model with the text classifier in the ktrain library? When using:

model = text.text_classifier('bert', (x_train, y_train), preproc=preproc)

This uses the multilingual pretrained model.

However, I would also like to try a monolingual model, namely the Dutch model 'wietsedv/bert-base-dutch-cased', which is also used in other ktrain implementations, for example.

However, when trying to use this model in the text classifier, it does not work:

model = text.text_classifier('bert', (x_train, y_train),
                             preproc=preproc, bert_model='wietsedv/bert-base-dutch-cased')

nor does:

model = text.text_classifier('wietsedv/bert-base-dutch-cased', (x_train, y_train), preproc=preproc)

Does anyone know how to do this? Thanks.


1 Answer

User · #1 · Posted 2024-09-30 22:12:53

There are two text-classification APIs in ktrain. The first is the text_classifier API, which can be used with a select number of transformer and non-transformer models. The second is the Transformer API, which can be used with any transformers model, including the one you listed.

The latter is explained in more detail in this tutorial notebook and this Medium article.
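As for why the calls in the question fail: the text_classifier API only accepts a small fixed set of model names (such as 'bert' or 'distilbert'), not arbitrary Hugging Face model IDs. A minimal sketch, assuming a recent ktrain version that provides print_text_classifiers, of how to list the accepted names:

# List the model names accepted by the text_classifier API.
# (Sketch assuming a recent ktrain version; 'wietsedv/bert-base-dutch-cased'
# is a Hugging Face model ID, not one of these names, which is why the
# calls in the question fail.)
from ktrain import text
text.print_text_classifiers()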

In the example below, you can replace MODEL_NAME with whichever model you want. For example:

# load text data
categories = ['alt.atheism', 'soc.religion.christian','comp.graphics', 'sci.med']
from sklearn.datasets import fetch_20newsgroups
train_b = fetch_20newsgroups(subset='train', categories=categories, shuffle=True)
test_b = fetch_20newsgroups(subset='test',categories=categories, shuffle=True)
(x_train, y_train) = (train_b.data, train_b.target)
(x_test, y_test) = (test_b.data, test_b.target)

# build, train, and validate model (Transformer is wrapper around transformers library)
import ktrain
from ktrain import text
MODEL_NAME = 'distilbert-base-uncased'  # replace this with model of choice
t = text.Transformer(MODEL_NAME, maxlen=500, class_names=train_b.target_names)
trn = t.preprocess_train(x_train, y_train)
val = t.preprocess_test(x_test, y_test)
model = t.get_classifier()
learner = ktrain.get_learner(model, train_data=trn, val_data=val, batch_size=6)
learner.fit_onecycle(5e-5, 4)
learner.validate(class_names=t.get_classes()) # class_names must be string values

# Output from learner.validate()
#                        precision    recall  f1-score   support
#
#           alt.atheism       0.92      0.93      0.93       319
#         comp.graphics       0.97      0.97      0.97       389
#               sci.med       0.97      0.95      0.96       396
#soc.religion.christian       0.96      0.96      0.96       398
#
#              accuracy                           0.96      1502
#             macro avg       0.95      0.96      0.95      1502
#          weighted avg       0.96      0.96      0.96      1502
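To train on the Dutch model from the question, you would then only need to change MODEL_NAME. A minimal sketch, assuming 'wietsedv/bert-base-dutch-cased' can be downloaded from the Hugging Face hub and that your training/test data and class names are prepared as in the example above (the Dutch sentence below is just an illustrative input):

# Same Transformer pipeline as above, but pointing MODEL_NAME at the Dutch
# BERT model from the question (sketch; assumes the model ID is available on
# the Hugging Face hub and that x_train/y_train, x_test/y_test and the class
# names are defined as in the example above).
import ktrain
from ktrain import text

MODEL_NAME = 'wietsedv/bert-base-dutch-cased'
t = text.Transformer(MODEL_NAME, maxlen=500, class_names=train_b.target_names)
trn = t.preprocess_train(x_train, y_train)
val = t.preprocess_test(x_test, y_test)
model = t.get_classifier()
learner = ktrain.get_learner(model, train_data=trn, val_data=val, batch_size=6)
learner.fit_onecycle(5e-5, 4)

# Wrap the trained model and its preprocessing into a predictor for new texts.
predictor = ktrain.get_predictor(learner.model, preproc=t)
print(predictor.predict('Dit is een voorbeeldzin.'))  # returns a predicted class name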
