I am looking to build a pipeline that applies the Hugging Face BART model step by step. Once the pipeline is built, I want to replace the encoder attention heads with pre-trained/pre-defined attention heads.
The pipeline I would like to implement is as follows:
At the moment my code looks like the commented snippet below, and this is where I am stuck:
from transformers import AutoTokenizer, AutoModel, BartConfig

article = """Text to be summarised."""
model_name = "facebook/bart-large-cnn"

# Values of the dictionary are tensors
# (cars_encoder_attention_layer and countries_encoder_attention_layer
# are defined elsewhere)
attention_heads = {
    "Cars": cars_encoder_attention_layer,
    "Countries": countries_encoder_attention_layer,
}

config = BartConfig.from_pretrained(
    model_name, output_hidden_states=True, output_attentions=True
)
tokenizer = AutoTokenizer.from_pretrained(model_name)
inputs = tokenizer(article, padding=True, truncation=True, return_tensors="pt")
model = AutoModel.from_pretrained(model_name, config=config)
outputs = model(**inputs)

# Overwrite the encoder attentions with the desired attention heads
outputs.encoder_attentions = attention_heads["Cars"]
# Here I would take the overwritten encoder output and insert it into the
# decoder to generate the summary with the adjusted attention heads
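For reference, the step described in the last comment (running the encoder once, then handing its output to the decoder for generation) can be sketched as below. This is only a minimal sketch, not a confirmed solution: it uses a tiny randomly initialised BART so it runs without downloading `facebook/bart-large-cnn`, and the actual head replacement is shown only as a commented placeholder, since the shape and type of `cars_encoder_attention_layer` are not specified in the question.

```python
import torch
from transformers import BartConfig, BartForConditionalGeneration

# Tiny randomly initialised BART so the sketch runs without a download;
# with the real checkpoint you would instead call
# BartForConditionalGeneration.from_pretrained("facebook/bart-large-cnn").
config = BartConfig(
    vocab_size=100, d_model=32,
    encoder_layers=2, decoder_layers=2,
    encoder_attention_heads=4, decoder_attention_heads=4,
    encoder_ffn_dim=64, decoder_ffn_dim=64,
    max_position_embeddings=64,
)
model = BartForConditionalGeneration(config)
model.eval()

input_ids = torch.tensor([[0, 10, 20, 30, 2]])
attention_mask = torch.ones_like(input_ids)

# Hypothetical head swap: replace the self-attention module of encoder
# layer 0 before running the encoder. The replacement would need the same
# d_model / num_heads shape as the module it replaces.
# model.model.encoder.layers[0].self_attn = cars_encoder_attention_layer

# Run the encoder on its own.
with torch.no_grad():
    enc_out = model.get_encoder()(
        input_ids=input_ids, attention_mask=attention_mask
    )

# Feed the precomputed encoder output straight into generate(), so the
# decoder uses it instead of re-running the encoder internally.
summary_ids = model.generate(
    encoder_outputs=enc_out,
    attention_mask=attention_mask,
    max_length=10,
)
print(summary_ids.shape)  # (1, <= max_length)
```

Note that assigning to `outputs.encoder_attentions` after a forward pass only edits the returned output object; to influence generation, the modification has to happen inside the model (as in the commented swap above) before the encoder runs.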