Error when converting to TensorRT:
input: kMAX dimensions in profile 0 are [2,3,128,128] but input has static dimensions [1,3,128,128]
Cause 1:
The ONNX model was exported with a fixed batch_size, while the TensorRT optimization profile uses a dynamic batch_size, so the two do not match and the error above is raised.
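For context, the "kMAX dimensions in profile 0" come from the optimization profile configured on the TensorRT side. Below is a minimal sketch of building an engine from the ONNX file with such a profile; it is not the original build script, it assumes the TensorRT 8 Python API, the input name 'input', and the shapes taken from the error message. If the ONNX input still has a static batch of 1, a profile whose kMAX batch is 2 conflicts with it and TensorRT reports the mismatch shown above.

import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
# An explicit-batch network is required for dynamic shapes
network = builder.create_network(1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model2.onnx", "rb") as f:  # assumed file name
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("failed to parse the ONNX model")

config = builder.create_builder_config()
profile = builder.create_optimization_profile()
# min / opt / max shapes for the input tensor 'input'; kMAX batch is 2, as in the error message
profile.set_shape("input", (1, 3, 128, 128), (1, 3, 128, 128), (2, 3, 128, 128))
config.add_optimization_profile(profile)

engine_bytes = builder.build_serialized_network(network, config)
with open("model2.trt", "wb") as f:
    f.write(engine_bytes)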
Cause 2: a fixed batch_size was passed via input_shapes when simplifying the model with onnx-simplifier, which leads to the same error:
import onnx
from onnxsim import simplify

onnx_model = onnx.load("model2.onnx")  # load the ONNX model
output_path = 'skip_simp.onnx'
model_simp, check = simplify(onnx_model, input_shapes={'input': [1, 4, 128, 128]})  # the fixed shape here re-freezes the batch dimension
Solution to cause 1:
Export the ONNX model with a dynamic batch_size:
import torch

# model is the trained PyTorch model to be exported
data = torch.randn(1, 4, 128, 128)  # .cuda()
model.eval()
torch.onnx.export(model, data, "model2.onnx",
                  export_params=True, opset_version=11, do_constant_folding=True,  # execute constant folding for optimization
                  input_names=['input'],    # the model's input names
                  output_names=['output'],  # the model's output names
                  dynamic_axes={'input': {0: 'batch_size'}, 'output': {0: 'batch_size'}})
# To also make the spatial dimensions dynamic:
# dynamic_axes={'input': {0: 'batch_size', 2: 'in_height', 3: 'in_width'}, 'output': {0: 'batch_size', 2: 'out_height', 3: 'out_width'}}
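To confirm that the exported model really has a dynamic batch dimension, the input shape can be inspected directly. This small check is added for this note and assumes the file name above:

import onnx

model = onnx.load("model2.onnx")
for inp in model.graph.input:
    dims = inp.type.tensor_type.shape.dim
    # a dynamic dimension has dim_param set (e.g. 'batch_size'); a static one has dim_value
    shape = [d.dim_param if d.dim_param else d.dim_value for d in dims]
    print(inp.name, shape)
# Expected for a dynamic batch: input ['batch_size', 4, 128, 128]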
Solution to cause 2:
import onnx
from onnxsim import simplify

onnx_model = onnx.load("model2.onnx")  # load the ONNX model
output_path = 'skip_simp2.onnx'
model_simp, check = simplify(onnx_model, dynamic_input_shape=True)  # keep the dynamic input shape during simplification
# model_simp, check = simplify(onnx_model)
assert check, "Simplified ONNX model could not be validated"
onnx.save(model_simp, output_path)
print('finished exporting onnx')
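As an optional sanity check before building the TensorRT engine, the simplified model can be run through ONNX Runtime with two different batch sizes; if both succeed, the batch dimension is truly dynamic. This sketch is not from the original post and assumes the input name 'input' and a 4-channel 128x128 input:

import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("skip_simp2.onnx", providers=["CPUExecutionProvider"])
for batch in (1, 2):
    x = np.random.randn(batch, 4, 128, 128).astype(np.float32)
    outputs = sess.run(None, {"input": x})
    print(f"batch={batch}, output shape={outputs[0].shape}")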
Reposted article; original link: https://blog.csdn.net/jacke121/article/details/125902143