python-3.x NCS2 AsyncInferQueue returns previous results instead of the true result for a specific inference

igsr9ssn · posted 2023-03-31 · in Python

We've been using the NCS2 for many months now, and recently noticed some very strange behavior. I've included the full script for a minimal reproducible program below. First, though, here are the installation conditions:

  • Raspberry Pi 4 B+, running Raspbian GNU/Linux 11 (Bullseye)
  • python3 --version is Python 3.9.2
  • openvino build from 2022.1.1

Behavior:

The code we're running takes a batch of n images, processes them asynchronously (we found this performs best), and then returns the batch. See syn below.
We expected 16 distinct results, but for some reason we seem to get the result for the image index mod the number of jobs of the async infer queue. For the jobs=1 case below, the result for every image is the same as the first result (but note: userdata is unique, so the AsyncInferQueue is passing a unique userdata value to the callback).

import operator as op
from collections import namedtuple
from typing import Any, List

import numpy as np
import zarr
from openvino.runtime import AsyncInferQueue, InferRequest

# compiled_model: an openvino CompiledModel created earlier (not shown in the post)
_temp_infer_queue = AsyncInferQueue(compiled_model, jobs=1)
AsyncInferenceResult = namedtuple("AsyncInferenceResult", ["id", "result"])

def syn(input_imgs, sort = False):
    res: List[AsyncInferenceResult] = []

    def _cb(
        infer_request: InferRequest, userdata: Any
    ) -> None:
        res.append(
            AsyncInferenceResult(
                id=userdata, result=infer_request.output_tensors[0].data[:]
                # also tried the following:
                # id=userdata, result=infer_request.get_output_tensor(0).data
            )
        )

    _temp_infer_queue.set_callback(_cb)

    for i, image in enumerate(input_imgs):
        tensor = np.expand_dims(image, (0, 3))
        # if all tensors were the same, their sum would be the same
        # easy way to verify that each image is unique
        print("TENSOR SUM", tensor.sum())
        _temp_infer_queue.start_async({0: tensor}, userdata=i)

    _temp_infer_queue.wait_all()

    for r1 in res:
        print(r1)

    print("---------------------------")
    if sort:
        return [r.result for r in sorted(res, key=op.attrgetter("id"))]
    return res

data = zarr.open("../../../allan/2023-03-03-135043__nomaxnoflowcontrol2.zip")

# yield_n will give n samples from an iterator - in this case,
# it will give [0,1,2,3], then [4,5,6,7], etc
for index_batch in yield_n(range(data.initialized), 4):
    images = [data[:, :, i] for i in index_batch]
    syn(images, sort=True)
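(yield_n is a helper that isn't shown in the post; a minimal sketch of what it might look like, assuming it simply chunks an iterator into lists of up to n items as the comment above describes — the implementation below is hypothetical:)

```python
from itertools import islice
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")

def yield_n(iterable: Iterable[T], n: int) -> Iterator[List[T]]:
    """Yield successive lists of up to n items from an iterable."""
    it = iter(iterable)
    # islice takes the next n items; an empty list means the iterator is done
    while chunk := list(islice(it, n)):
        yield chunk

print(list(yield_n(range(10), 4)))  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```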

Expected behavior: unique values for the results, since we are running inference on unique images

TENSOR SUM 181712885                                                   
TENSOR SUM 182752565                                                   
TENSOR SUM 182640761                                                   
TENSOR SUM 182361927                                                   
AsyncInferenceResult(id=0, result=array([[3.1972656]], dtype=float32)) 
AsyncInferenceResult(id=1, result=array([[2.3463234]], dtype=float32)) 
AsyncInferenceResult(id=2, result=array([[-1.345323]], dtype=float32)) 
AsyncInferenceResult(id=3, result=array([[3.0023452]], dtype=float32)) 
---------------------------                                            
TENSOR SUM 182579212                                                   
TENSOR SUM 182199813                                                   
TENSOR SUM 180750311                                                   
TENSOR SUM 180896550                                                   
AsyncInferenceResult(id=0, result=array([[1.2942656]], dtype=float32)) 
AsyncInferenceResult(id=1, result=array([[1.3351234]], dtype=float32)) 
AsyncInferenceResult(id=2, result=array([[2.3451223]], dtype=float32)) 
AsyncInferenceResult(id=3, result=array([[0.0345552]], dtype=float32))
---------------------------      
...etc

Actual behavior: every inference result is identical

TENSOR SUM 181712885                                                    
TENSOR SUM 182752565                                                    
TENSOR SUM 182640761                                                    
TENSOR SUM 182361927                                                    
AsyncInferenceResult(id=0, result=array([[3.1972656]], dtype=float32))  
AsyncInferenceResult(id=1, result=array([[3.1972656]], dtype=float32))  
AsyncInferenceResult(id=2, result=array([[3.1972656]], dtype=float32))  
AsyncInferenceResult(id=3, result=array([[3.1972656]], dtype=float32))  
---------------------------                                             
TENSOR SUM 182579212                                                    
TENSOR SUM 182199813                                                    
TENSOR SUM 180750311                                                    
TENSOR SUM 180896550                                                    
AsyncInferenceResult(id=0, result=array([[2.6289062]], dtype=float32))  
AsyncInferenceResult(id=1, result=array([[2.6289062]], dtype=float32))  
AsyncInferenceResult(id=2, result=array([[2.6289062]], dtype=float32))  
AsyncInferenceResult(id=3, result=array([[2.6289062]], dtype=float32))  
---------------------------     
...etc

When we set the number of jobs on the AsyncInferQueue to 2, the same values repeat (mod the number of jobs):

TENSOR SUM 181508284                                                    
TENSOR SUM 182244105                                                    
TENSOR SUM 181800558                                                    
TENSOR SUM 182178069                                                    
AsyncInferenceResult(id=0, result=array([[4.4921875]], dtype=float32))  
AsyncInferenceResult(id=1, result=array([[3.3867188]], dtype=float32))  
AsyncInferenceResult(id=2, result=array([[4.4921875]], dtype=float32))  
AsyncInferenceResult(id=3, result=array([[3.3867188]], dtype=float32))  
---------------------------                                             
TENSOR SUM 181820857                                                    
TENSOR SUM 181130636                                                    
TENSOR SUM 181852573                                                    
TENSOR SUM 181331641                                                    
AsyncInferenceResult(id=0, result=array([[2.3867188]], dtype=float32))  
AsyncInferenceResult(id=1, result=array([[2.9765625]], dtype=float32))  
AsyncInferenceResult(id=2, result=array([[2.3867188]], dtype=float32))  
AsyncInferenceResult(id=3, result=array([[2.9765625]], dtype=float32))  
---------------------------                                
...etc

So what is going on here? Am I doing something wrong? I've tried to follow the documentation as closely as I can (though that isn't easy: the docs can be a bit sparse, and searching them tends to turn up older versions of openvino, etc.). If I am doing something wrong here, it seems like an easy trap to fall into? Shouldn't there be a loud failure somewhere?
We've been using the NCS2 for many months, so we hope this is an easy fix.
Let me know what needs clarification. I'm really hoping for some help here!
Thanks in advance! :)


5vf7fwbs1#

The problem stems from this part of the Python demo script:

def _cb(
    infer_request: InferRequest, userdata: Any
) -> None:
    res.append(
        AsyncInferenceResult(
            id=userdata, result=infer_request.output_tensors[0].data[:]
            # also tried the following:
            # id=userdata, result=infer_request.get_output_tensor(0).data
        )
    )

The results list ends up reading the final value of the shared output tensor, rather than the value each individual inference produced: .data[:] returns a numpy view into the InferRequest's output buffer, not an independent copy. Because the AsyncInferQueue reuses its InferRequest objects (one per job), every stored result aliases one of those reused buffers, which each subsequent inference overwrites in place; hence identical results with jobs=1, and results repeating mod the number of jobs with jobs=2. Copying the tensor data inside the callback, e.g. result=infer_request.get_output_tensor(0).data.copy(), gives each result its own memory.
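The aliasing can be reproduced with numpy alone (a minimal sketch: the single buffer stands in for a jobs=1 queue's reused output tensor, and the result values are illustrative; no OpenVINO needed):

```python
import numpy as np

# Simulate the single reused output buffer of a jobs=1 AsyncInferQueue:
# each "inference" overwrites the same array in place.
buffer = np.zeros((1, 1), dtype=np.float32)
outputs = [3.1972656, 2.3463234, -1.345323, 3.0023452]

views, copies = [], []
for value in outputs:
    buffer[...] = value           # the queue writes the new result in place
    views.append(buffer[:])       # what the callback stored: a view of buffer
    copies.append(buffer.copy())  # the fix: an independent copy

# Every view aliases the same memory, so all four show the last value written;
# the copies preserve the four distinct results.
print([float(v[0, 0]) for v in views])
print([float(c[0, 0]) for c in copies])
```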
