Issue type
Bug
Have you reproduced the bug with TF nightly?
No
Source
source
TensorFlow version
2.10
Custom code
Yes
OS platform and distribution
Ubuntu 22.04
Mobile device
No response
Python version
3.10
Bazel version
No response
GCC/compiler version
No response
CUDA/cuDNN version
No response
GPU model and memory
No response
Current behavior?
Attempting to compute the gradient fails, whether inside an autographed Keras layer or in eager mode. The failure seems to occur when concatenating different slices of a ragged tensor.
Standalone code to reproduce the issue
https://colab.research.google.com/drive/1kteIaQeDouRH-DEEYYeGE9jiMtgcXUZm#scrollTo=MOhPQNf4JDBB
import tensorflow as tf

values = tf.constant([0, 1, 2, 3, 4, 5, 6, 7, 8, 9], tf.float32)
values = tf.reshape(values, [-1, 1])
r = tf.RaggedTensor.from_row_lengths(values, [0, 2, 2, 1, 0, 2, 3, 0, 0])
r = tf.RaggedTensor.from_uniform_row_length(r, 3)
r = tf.RaggedTensor.from_uniform_row_length(r, 3)

def crop(raggedImage: tf.RaggedTensor, top, bottom, left, right) -> tf.RaggedTensor:
    '''
    Crops a ragged tensor, removing 'pixels' from the boundary.
    The input is interpreted as being b x h x w x s x c layout.
    Only the sample dimension may be ragged.
    '''
    if bottom > 0:
        bottom = -bottom
    else:
        bottom = None
    if right > 0:
        right = -right
    else:
        right = None
    cropped = raggedImage[:, top:bottom, left:right, :, :]
    return cropped

diameter = 3
with tf.GradientTape() as tape:
    tape.watch(r)
    l = []
    for u in range(diameter):
        for v in range(diameter):
            l.append(crop(r, u, diameter - 1 - u, v, diameter - 1 - v))
    s = tf.concat(l, axis=2)

tf.print(tape.gradient(s, [r]))
Relevant log output
Traceback (most recent call last):
File "venv-2.10/lib/python3.10/site-packages/tensorflow/python/eager/backprop.py", line 663, in _num_elements
shape_tuple = grad.values._shape_tuple() # pylint: disable=protected-access
AttributeError: 'IndexedSlices' object has no attribute '_shape_tuple'
3 Answers

Answer 1:
I was able to replicate this issue on Ubuntu and in Colab. Please find the gists below:
For TensorFlow v2.10:
For tf-nightly:
Thank you!
Answer 2:
Thanks for reporting this issue.
The error comes from the following code:
Source: tensorflow/tensorflow/python/eager/backprop.py, lines 606 to 616:
def _num_elements(grad):
  """The number of elements in the `grad` tensor."""
  if isinstance(grad, ops.Tensor):
    shape_tuple = grad._shape_tuple()  # pylint: disable=protected-access
  elif isinstance(grad, indexed_slices.IndexedSlices):
    shape_tuple = grad.values._shape_tuple()  # pylint: disable=protected-access
  else:
    raise ValueError("`grad` not a Tensor or IndexedSlices.")
  if shape_tuple is None or None in shape_tuple:
    return 0
  return functools.reduce(operator.mul, shape_tuple, 1)
The code expects that when grad is an indexed_slices.IndexedSlices, grad.values returns a Tensor, as the IndexedSlices API specifies. Since the reported snippet fails at that line, other code in the block above that makes the same assumption may also fail and needs to be checked.
Further digging may be needed to find the root cause. Thanks!
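For contrast, here is a minimal sketch (illustrative, not from the report) of the case the API expects: the gradient of a tf.gather comes back as an IndexedSlices whose .values attribute is a plain dense Tensor.

```python
import tensorflow as tf

# In the ordinary sparse-gradient case, tf.gather produces an
# IndexedSlices gradient whose .values is a dense Tensor.
params = tf.Variable(tf.ones([5, 2]))
with tf.GradientTape() as tape:
    y = tf.gather(params, [0, 2])
grad = tape.gradient(y, params)

print(isinstance(grad, tf.IndexedSlices))         # True
print(isinstance(grad.values, tf.IndexedSlices))  # False: values is dense
```

The ragged-tensor repro apparently breaks this invariant, which is why `grad.values._shape_tuple()` fails.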
Answer 3:
Hi, to my limited knowledge of TensorFlow internals, grad.values is itself an object of type IndexedSlices here. I have not found out whether this is expected (apparently it is not), or how it comes about.