tensorflow: loading TF1 protocol buffers no longer works in TF versions 2.2.0 and later

myss37ts · posted 6 months ago in Other

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): yes
  • OS platform and distribution (e.g., Linux Ubuntu 16.04): Linux 4.19.112 and Windows 10
  • TensorFlow version (use command below): v2.2.0-rc4-8-g2b96f3662b 2.2.0 and v2.3.0-0-gb36436b087 2.3.0
  • Python version: 3.6.9
  • CUDA/cuDNN version: CUDA disabled
  • GPU model and memory: GPU disabled

Describe the current behavior

A model trained in TensorFlow 1 and saved with tf.saved_model.simple_save cannot be loaded with tf.saved_model.load in TensorFlow 2.2.0 and later. The same code works perfectly fine in TensorFlow 2.1.1. The error I get on TF 2.2.0 and above is:

<ipython-input-3-fa86b40288e8> in <module>()
----> 1 model_loaded = tf.saved_model.load('tensorflow_model/')
      2 model_loaded = model_loaded.signatures['serving_default']

/usr/local/lib/python3.6/dist-packages/tensorflow/python/saved_model/load.py in load(export_dir, tags, options)
    601     ValueError: If `tags` don't match a MetaGraph in the SavedModel.
    602   """
--> 603   return load_internal(export_dir, tags, options)
    604 
    605 

/usr/local/lib/python3.6/dist-packages/tensorflow/python/saved_model/load.py in load_internal(export_dir, tags, options, loader_cls)
    647   else:
    648     with ops.init_scope():
--> 649       root = load_v1_in_v2.load(export_dir, tags)
    650       root.graph_debug_info = debug_info
    651   return root

/usr/local/lib/python3.6/dist-packages/tensorflow/python/saved_model/load_v1_in_v2.py in load(export_dir, tags)
    261   """Load a v1-style SavedModel as an object."""
    262   loader = _EagerSavedModelLoader(export_dir)
--> 263   return loader.load(tags=tags)

/usr/local/lib/python3.6/dist-packages/tensorflow/python/saved_model/load_v1_in_v2.py in load(self, tags)
    207     wrapped = wrap_function.wrap_function(
    208         functools.partial(self.load_graph, load_graph_returns, meta_graph_def),
--> 209         signature=[])
    210     saver, = load_graph_returns
    211     restore_from_saver = self._extract_saver_restore(wrapped, saver)

/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/wrap_function.py in wrap_function(fn, signature, name)
    626           signature=signature,
    627           add_control_dependencies=False,
--> 628           collections={}),
    629       variable_holder=holder,
    630       signature=signature)

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/func_graph.py in func_graph_from_py_func(name, python_func, args, kwargs, signature, func_graph, autograph, autograph_options, add_control_dependencies, arg_names, op_return_value, collections, capture_by_value, override_flat_arg_shapes)
    984         _, original_func = tf_decorator.unwrap(python_func)
    985 
--> 986       func_outputs = python_func(*func_args, **func_kwargs)
    987 
    988       # invariant: `func_outputs` contains only Tensors, CompositeTensors,

/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/wrap_function.py in __call__(self, *args, **kwargs)
     85 
     86   def __call__(self, *args, **kwargs):
---> 87     return self.call_with_variable_creator_scope(self._fn)(*args, **kwargs)
     88 
     89   def call_with_variable_creator_scope(self, fn):

/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/wrap_function.py in wrapped(*args, **kwargs)
     91     def wrapped(*args, **kwargs):
     92       with variable_scope.variable_creator_scope(self.variable_creator_scope):
---> 93         return fn(*args, **kwargs)
     94 
     95     return wrapped

/usr/local/lib/python3.6/dist-packages/tensorflow/python/saved_model/load_v1_in_v2.py in load_graph(self, returns, meta_graph_def)
     88     # pylint: disable=protected-access
     89     saver, _ = tf_saver._import_meta_graph_with_return_elements(
---> 90         meta_graph_def)
     91     # pylint: enable=protected-access
     92     returns[0] = saver

/usr/local/lib/python3.6/dist-packages/tensorflow/python/training/saver.py in _import_meta_graph_with_return_elements(meta_graph_or_file, clear_devices, import_scope, return_elements, **kwargs)
   1484           import_scope=import_scope,
   1485           return_elements=return_elements,
-> 1486           **kwargs))
   1487 
   1488   saver = _create_saver_from_imported_meta_graph(meta_graph_def, import_scope,

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/meta_graph.py in import_scoped_meta_graph_with_return_elements(meta_graph_or_file, clear_devices, graph, import_scope, input_map, unbound_inputs_col_name, restore_collections_predicate, return_elements)
    797         input_map=input_map,
    798         producer_op_list=producer_op_list,
--> 799         return_elements=return_elements)
    800 
    801     # TensorFlow versions before 1.9 (not inclusive) exported SavedModels

/usr/local/lib/python3.6/dist-packages/tensorflow/python/util/deprecation.py in new_func(*args, **kwargs)
    505                 'in a future version' if date is None else ('after %s' % date),
    506                 instructions)
--> 507       return func(*args, **kwargs)
    508 
    509     doc = _add_deprecated_arg_notice_to_docstring(

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/importer.py in import_graph_def(***failed resolving arguments***)
    403       return_elements=return_elements,
    404       name=name,
--> 405       producer_op_list=producer_op_list)
    406 
    407 

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/importer.py in _import_graph_def_internal(graph_def, input_map, return_elements, validate_colocation_constraints, name, producer_op_list)
    499       except errors.InvalidArgumentError as e:
    500         # Convert to ValueError for backwards compatibility.
--> 501         raise ValueError(str(e))
    502 
    503     # Create _DefinedFunctions for any imported functions.

ValueError: Node 'loss/gradients/model/batch_normalization_3/FusedBatchNormV3_1_grad/FusedBatchNormGradV3' has an _output_shapes attribute inconsistent with the GraphDef for output #3: Dimension 0 in both shapes must be equal, but are 0 and 64. Shapes are [0] and [64].

Describe the expected behavior

The model should load regardless of the TF2 version.

Standalone code to reproduce the issue

Colab notebook to reproduce issue
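For reference, the failing round-trip can be sketched as follows. This is a minimal sketch with a toy stand-in graph, not the reporter's actual batch-norm model; any V1-style SavedModel exercises the same load_v1_in_v2 code path shown in the traceback:

```python
import os
import tempfile

import tensorflow as tf

def export_v1_style(export_dir):
    """Save a toy graph with the TF1-era simple_save API (a stand-in for
    the reporter's model; the export directory must not exist yet)."""
    with tf.Graph().as_default() as g:
        x = tf.compat.v1.placeholder(tf.float32, [None, 4], name="x")
        w = tf.compat.v1.get_variable("w", [4, 2])
        y = tf.matmul(x, w, name="y")
        with tf.compat.v1.Session(graph=g) as sess:
            sess.run(tf.compat.v1.global_variables_initializer())
            tf.compat.v1.saved_model.simple_save(
                sess, export_dir, inputs={"x": x}, outputs={"y": y})

def load_v2_style(export_dir):
    """Load the V1 SavedModel through the TF2 API; on the reporter's
    model this call raises the _output_shapes ValueError on TF >= 2.2."""
    loaded = tf.saved_model.load(export_dir)
    return loaded.signatures["serving_default"]

if __name__ == "__main__":
    export_dir = os.path.join(tempfile.mkdtemp(), "tensorflow_model")
    export_v1_style(export_dir)
    infer = load_v2_style(export_dir)
    print(list(infer.structured_outputs))
```

The toy graph above loads fine on current TF2 releases; the failure is specific to graphs whose stored `_output_shapes` attributes no longer match what the importer infers, as the traceback shows.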

Other info / logs

Although the above should be enough to reproduce the issue, there are more details in the Stackoverflow question I created.

Please provide a workaround on the latest TensorFlow version if at all possible, since I have to use this model in an environment that is restricted to the latest TensorFlow version.


r6l8ljro1#

I tried this in Colab with TF 2.1.1 and did not run into any issue. I can reproduce the problem with the TF nightly build (2.4.0-dev20200906). Please check the gist here. Thanks!


h5qlskok2#

I believe this commit broke backward compatibility with the op in question. I have forwarded the bug to the relevant maintainers, but I cannot estimate when they will be able to look at it.
In the meantime, you mentioned that you mainly just need a workaround. My reading of the code is that the problematic outputs are really just reserved placeholders and are not actually used. I therefore tried editing your model directly to make the output sizes compatible, and it appears to load. I will contact you privately to send you the fixed model.
If you want to make the change yourself: I changed the shape dimension size of the fourth output (index 3 when zero-indexed) of the failing node ( "loss/gradients/model/batch_normalization_3/FusedBatchNormV3_1_grad/FusedBatchNormGradV3" ) from 64 to 0 (the exact path on the node is attr["_output_shapes"][3].shape.dim.size ).
If that does not work, we will have to wait for a proper fix.
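The manual edit described above can also be scripted against the SavedModel proto. A minimal sketch, assuming the model sits under tensorflow_model/ as in the question and that the file is a standard V1 SavedModel; the node name is taken verbatim from the traceback, while the script itself is illustrative and not the answerer's actual tooling:

```python
import os

# Node name copied verbatim from the ValueError in the question.
BAD_NODE = ("loss/gradients/model/batch_normalization_3/"
            "FusedBatchNormV3_1_grad/FusedBatchNormGradV3")

def patch_output_shape(node, output_index, new_size):
    """Overwrite dimension 0 of `_output_shapes[output_index]` on a NodeDef.

    `node` can be anything shaped like a TF NodeDef proto, i.e. exposing
    node.attr["_output_shapes"].list.shape[i].dim[j].size.
    Returns the previous size so the caller can log what changed.
    """
    shape = node.attr["_output_shapes"].list.shape[output_index]
    old = shape.dim[0].size
    shape.dim[0].size = new_size
    return old

if __name__ == "__main__":
    path = "tensorflow_model/saved_model.pb"  # hypothetical location
    if os.path.exists(path):  # needs tensorflow and the exported model
        from tensorflow.core.protobuf import saved_model_pb2

        sm = saved_model_pb2.SavedModel()
        with open(path, "rb") as f:
            sm.ParseFromString(f.read())

        for node in sm.meta_graphs[0].graph_def.node:
            if node.name == BAD_NODE:
                old = patch_output_shape(node, output_index=3, new_size=0)
                print("patched output #3 of %s: %d -> 0" % (node.name, old))

        with open(path, "wb") as f:
            f.write(sm.SerializeToString())
```

After rewriting saved_model.pb in place, the load should no longer trip the shape-consistency check for that node; per the reasoning above, the patched output is an unused placeholder, so inference should be unaffected.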


fkaflof63#

Able to reproduce your issue in Tensorflow 2.11. Please check the gist here. Thanks!
