CTranslate2 support for BART models for classification

mpbci0fu posted 3 months ago in Other

Hi,

I'm trying to convert this adaptation of Bart Large MNLI: https://huggingface.co/joeddav/bart-large-mnli-yahoo-answers. It returns the following error (the base Bart Large MNLI model converts fine):

Traceback (most recent call last):
  File "/home/ubuntu/.local/bin/./ct2-transformers-converter", line 8, in <module>
    sys.exit(main())
  File "/home/ubuntu/.local/lib/python3.10/site-packages/ctranslate2/converters/transformers.py", line 445, in main
    converter.convert_from_args(args)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/ctranslate2/converters/converter.py", line 50, in convert_from_args
    return self.convert(
  File "/home/ubuntu/.local/lib/python3.10/site-packages/ctranslate2/converters/converter.py", line 89, in convert
    model_spec = self._load()
  File "/home/ubuntu/.local/lib/python3.10/site-packages/ctranslate2/converters/transformers.py", line 62, in _load
    return loader(self._model_name_or_path)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/ctranslate2/converters/transformers.py", line 85, in __call__
    spec = self.get_model_spec(model)
  File "/home/ubuntu/.local/lib/python3.10/site-packages/ctranslate2/converters/transformers.py", line 146, in get_model_spec
    pre_norm=model.config.normalize_before,
  File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 257, in __getattribute__
    return super().__getattribute__(key)
AttributeError: 'BartConfig' object has no attribute 'normalize_before'
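
For reference, the same loader can also be reached through the CTranslate2 Python API; this is a minimal sketch equivalent to the CLI call above (the output directory name is just an example):

    import ctranslate2

    # Equivalent of the ct2-transformers-converter CLI call that produced
    # the traceback above; it fails with the same AttributeError.
    converter = ctranslate2.converters.TransformersConverter(
        "joeddav/bart-large-mnli-yahoo-answers"
    )
    converter.convert("bart-large-mnli-yahoo-ct2")  # hypothetical output directory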

Thanks in advance!

jmp7cifd #1

There are many different configurations to consider. Thanks again for the report.
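
As a minimal sketch (not necessarily how the converter handles it), the missing attribute can be read with a fallback default instead of being accessed directly; the False default is an assumption based on BART being a post-norm architecture:

    from transformers import AutoConfig

    config = AutoConfig.from_pretrained("joeddav/bart-large-mnli-yahoo-answers")

    # This config does not define `normalize_before`, so a direct attribute
    # access raises AttributeError; reading it with a default avoids the crash.
    pre_norm = getattr(config, "normalize_before", False)  # assumed default
    print(pre_norm)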

iqih9akk #2

Actually, this model is not fully supported. It uses the BartForSequenceClassification architecture, and we currently don't support the additional classification head.
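
The extra head in question can be seen by loading the model with Transformers; a quick inspection sketch:

    from transformers import BartForSequenceClassification

    model = BartForSequenceClassification.from_pretrained(
        "joeddav/bart-large-mnli-yahoo-answers"
    )

    # `classification_head` (a dense layer plus output projection) is the
    # extra module on top of the seq2seq weights that is not converted.
    print(model.classification_head)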

xam8gpfp #3

Thanks for looking into it, @guillaumekln!

s4n0splo #4

I second this. It would be great to support BERT-like models that consist of only an encoder and a classification head. More specifically, if we could use pretrained parsers like these: https://ufal.mff.cuni.cz/udpipe/2/models, it would make integrating them into a pipeline much easier.

vfh0ocws #5

This issue is about BART (a sequence-to-sequence model). I created a separate issue for encoder-only models such as BERT: #1008
