bert — ValueError: Could not find matching function to call loaded from the SavedModel. Got positional arguments (2 total): False, None; keyword arguments: {'as_dict': True, 'signature': 'tokenization_info'}

w41d8nur · asked 2 months ago · 3 answers

I have the following code to call a BERT model, and I am getting this error:
  File "/home/dr/Desktop/dali-md-master/biaffine_md.py", line 26, in add_model_specific_valuables
    self.bert_tokenizer = self.load_bert_vocab()
  File "/home/dr/Desktop/dali-md-master/biaffine_md.py", line 53, in load_bert_vocab
    vocab_info = bert_model(signature="tokenization_info", as_dict=True)
  File "/home/dr/anaconda3/envs/hcoref/lib/python2.7/site-packages/tensorflow_core/python/saved_model/load.py", line 438, in _call_attribute
    return instance.__call__(*args, **kwargs)
  File "/home/dr/anaconda3/envs/hcoref/lib/python2.7/site-packages/tensorflow_core/python/eager/def_function.py", line 449, in __call__
    self._initialize(args, kwds, add_initializers_to=initializer_map)
  File "/home/dr/anaconda3/envs/hcoref/lib/python2.7/site-packages/tensorflow_core/python/eager/def_function.py", line 392, in _initialize
    *args, **kwds))
  File "/home/dr/anaconda3/envs/hcoref/lib/python2.7/site-packages/tensorflow_core/python/eager/function.py", line 1847, in _get_concrete_function_internal_garbage_collected
    graph_function, _, _ = self._maybe_define_function(args, kwargs)
  File "/home/dr/anaconda3/envs/hcoref/lib/python2.7/site-packages/tensorflow_core/python/eager/function.py", line 2147, in _maybe_define_function
    graph_function = self._create_graph_function(args, kwargs)
  File "/home/dr/anaconda3/envs/hcoref/lib/python2.7/site-packages/tensorflow_core/python/eager/function.py", line 2038, in _create_graph_function
    capture_by_value=self._capture_by_value),
  File "/home/dr/anaconda3/envs/hcoref/lib/python2.7/site-packages/tensorflow_core/python/framework/func_graph.py", line 915, in func_graph_from_py_func
    func_outputs = python_func(*func_args, **func_kwargs)
  File "/home/dr/anaconda3/envs/hcoref/lib/python2.7/site-packages/tensorflow_core/python/eager/def_function.py", line 335, in wrapped_fn
    return weak_wrapped_fn().__wrapped__(*args, **kwds)
  File "/home/dr/anaconda3/envs/hcoref/lib/python2.7/site-packages/tensorflow_core/python/saved_model/function_deserialization.py", line 262, in restored_function_body
    "\n\n".join(signature_descriptions)))
ValueError: Could not find matching function to call loaded from the SavedModel. Got:
  Positional arguments (2 total):
    * False
    * None
  Keyword arguments: {'as_dict': True, 'signature': 'tokenization_info'}

Expected these arguments to match one of the following 4 option(s):

Option 1:
  Positional arguments (3 total):
    * {u'input_word_ids': TensorSpec(shape=(?, ?), dtype=tf.int32, name=u'input_word_ids'), u'input_mask': TensorSpec(shape=(?, ?), dtype=tf.int32, name=u'input_mask'), u'input_type_ids': TensorSpec(shape=(?, ?), dtype=tf.int32, name=u'input_type_ids')}
    * False
    * None
  Keyword arguments: {}

Option 2:
  Positional arguments (3 total):
    * {u'input_word_ids': TensorSpec(shape=(?, ?), dtype=tf.int32, name=u'inputs/input_word_ids'), u'input_mask': TensorSpec(shape=(?, ?), dtype=tf.int32, name=u'inputs/input_mask'), u'input_type_ids': TensorSpec(shape=(?, ?), dtype=tf.int32, name=u'inputs/input_type_ids')}
    * False
    * None
  Keyword arguments: {}

Option 3:
  Positional arguments (3 total):
    * {u'input_word_ids': TensorSpec(shape=(?, ?), dtype=tf.int32, name=u'inputs/input_word_ids'), u'input_mask': TensorSpec(shape=(?, ?), dtype=tf.int32, name=u'inputs/input_mask'), u'input_type_ids': TensorSpec(shape=(?, ?), dtype=tf.int32, name=u'inputs/input_type_ids')}
    * True
    * None
  Keyword arguments: {}

Option 4:
  Positional arguments (3 total):
    * {u'input_word_ids': TensorSpec(shape=(?, ?), dtype=tf.int32, name=u'input_word_ids'), u'input_mask': TensorSpec(shape=(?, ?), dtype=tf.int32, name=u'input_mask'), u'input_type_ids': TensorSpec(shape=(?, ?), dtype=tf.int32, name=u'input_type_ids')}
    * True
    * None
  Keyword arguments: {}
Below is the code:

    def load_bert_vocab(self):
        with tf.Graph().as_default():
            bert_model = hub.Module(self.bert_url)
            bert_model = hub.load(self.bert_url)
            vocab_info = bert_model(signature="tokenization_info", as_dict=True)
            with tf.Session() as sess:
                vocab_file, do_lower_case = sess.run(
                    [vocab_info["vocab_file"], vocab_info["do_lower_case"]])
            return bert_tokenization.FullTokenizer(
                vocab_file=vocab_file, do_lower_case=do_lower_case)
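To see why the loader rejects this call, here is a minimal plain-Python sketch (not TensorFlow internals; the names `SAVED_SIGNATURES` and `call_restored` are invented for illustration) of how a restored SavedModel dispatches a call: each saved concrete function accepts a fixed number of positional arguments and no free keywords, so the TF1-style `signature=`/`as_dict=` call can never match.

```python
# Each saved concrete function expects exactly three positionals
# (an input dict, a training flag, a mask) and no keyword arguments.
SAVED_SIGNATURES = [
    ("inputs", "training=False", "mask=None"),
    ("inputs", "training=True", "mask=None"),
]

def call_restored(args, kwargs):
    """Mimic SavedModel dispatch: match on arity, reject any kwargs."""
    for sig in SAVED_SIGNATURES:
        if len(args) == len(sig) and not kwargs:
            return sig
    raise ValueError(
        "Could not find matching function to call loaded from the "
        "SavedModel. Got: %r, keyword arguments: %r" % (args, kwargs))

# The TF1-style call bert_model(signature=..., as_dict=True) reaches the
# loader as two positionals (False, None) plus two keywords, so nothing
# matches and the ValueError from the question is raised:
try:
    call_restored((False, None),
                  {"signature": "tokenization_info", "as_dict": True})
except ValueError as err:
    print(err)

# A call with the expected three positionals and no keywords dispatches:
print(call_restored(({"input_word_ids": "..."}, False, None), {}))
```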
toiithl6 1#

This error occurs because no function signature matching your call can be found in the loaded BERT model. The `signature=`/`as_dict=` calling convention belongs to the TF1 `hub.Module` API; a model loaded with `hub.load` is a TF2 SavedModel whose callable only accepts the positional argument combinations listed in the error. Given the code and the error message, try the following:

  1. Make sure `self.bert_url` is the correct BERT model URL.
  2. Check that the `signature` argument in `load_bert_vocab` is correct; here it should be `"tokenization_info"`.
  3. If the problem persists, try a different BERT model version or obtain the model from another source.

Here is the modified `load_bert_vocab` function, using `hub.Module` instead of `hub.load`:

    def load_bert_vocab(self):
        with tf.Graph().as_default():
            bert_model = hub.Module(self.bert_url)
            vocab_info = bert_model(signature="tokenization_info", as_dict=True)
            with tf.Session() as sess:
                vocab_file, do_lower_case = sess.run([vocab_info["vocab_file"], vocab_info["do_lower_case"]])
        return bert_tokenization.FullTokenizer(vocab_file=vocab_file, do_lower_case=do_lower_case)

Make sure to replace `self.bert_url` with the actual BERT model URL.

tvmytwxo 2#

Did you ever find a solution for this?

wooyq4lh 3#

This error occurs because no matching function signature is found when the BERT model is called. The error message lists 4 candidate options, and the arguments you pass match none of them. To fix it, check the `bert_model` call in `load_bert_vocab` and verify the `signature` argument.

First, make sure the tensorflow-hub library is installed correctly and that `self.bert_url` points to the correct BERT model URL. Then try the following change (note that `hub.Module` has no `build()` method; the module graph is constructed when it is instantiated):

    def load_bert_vocab(self):
        with tf.Graph().as_default():
            bert_model = hub.Module(self.bert_url)
            vocab_info = bert_model(signature="tokenization_info", as_dict=True)

If the problem persists, check whether your BERT model actually exports a `"tokenization_info"` signature. See the TensorFlow Hub documentation for how to inspect the signatures a module provides.
