I'm trying to convert a small Keras model and run it with tflite_runtime. Converting to tflite works, and running inference with tf.lite also works fine, but when using the Interpreter from tflite_runtime.interpreter I get "Segmentation fault: 11" with no other error message. Any ideas how to fix this? I need this to run without TensorFlow, using only the tflite runtime.
I'm on macOS with: Python 3.8.5, tensorflow 2.7.0, tflite_runtime 2.5.0.
The model detects hand poses from a set of landmarks:
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dropout (Dropout) (None, 42) 0
dense (Dense) (None, 50) 2150
dropout_1 (Dropout) (None, 50) 0
dense_1 (Dense) (None, 50) 2550
dense_2 (Dense) (None, 5) 255
=================================================================
Total params: 4,955
Trainable params: 4,955
Non-trainable params: 0
_________________________________________________________________
Code used for the conversion:
import numpy as np
import pandas as pd
import tensorflow as tf

saved_model_dir = './save_at_500.h5'
model = tf.keras.models.load_model(saved_model_dir)

df = pd.read_csv('./test_hand_data_2.csv')
gt = np.array([])
lmk = np.array([])
gt = np.append(gt, df['pose'].to_numpy() - 1)
lmk = np.append(lmk, df.loc[:, 'lx0':'ly20'].to_numpy())
lmk = np.reshape(lmk, (gt.shape[0], 42))

def representative_dataset():
    for data in tf.data.Dataset.from_tensor_slices((lmk)).batch(1).take(100):
        yield [tf.dtypes.cast(data, tf.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]
converter.allow_custom_ops = True
converter.representative_dataset = representative_dataset
tflite_quant_model = converter.convert()

with open('model.tflite', 'wb') as f:
    f.write(tflite_quant_model)
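Before handing the written file to tflite_runtime, a quick byte-level sanity check can rule out a truncated or corrupt file as the crash cause. This is a minimal sketch; `looks_like_tflite` is a hypothetical helper relying on the fact that a .tflite FlatBuffer carries the file identifier "TFL3" at byte offset 4:

```python
def looks_like_tflite(path_or_bytes):
    """Return True if the argument looks like a TFLite flatbuffer.

    Accepts either raw bytes or a path to a file; only the first
    8 bytes are needed, since the "TFL3" file identifier sits at
    byte offset 4 of every .tflite file.
    """
    if isinstance(path_or_bytes, bytes):
        data = path_or_bytes
    else:
        with open(path_or_bytes, 'rb') as f:
            data = f.read(8)
    return len(data) >= 8 and data[4:8] == b"TFL3"
```

A file that fails this check would crash any interpreter, so passing it at least narrows the problem down to the runtime itself.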
Code used to run inference (note: the commented-out line works fine):
import cv2
import numpy as np
import tflite_runtime.interpreter as tflite
#import tensorflow as tf
import pandas as pd
import os
from time import time

def main():
    # `path` and `get_data()` are defined elsewhere in the script
    model_path = os.path.join(path, 'models/6-10/model.tflite')
    #interpreter = tf.lite.Interpreter(model_path)  # this works!
    interpreter = tflite.Interpreter(model_path=model_path)  # segmentation fault here
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()
    print('INPUT\n', input_details)
    print('\n OUTPUT\n', output_details)

    lmk, gt = get_data()
    # set_tensor expects a numpy array matching the input dtype, not a plain list
    input_data = np.array([lmk[0]], dtype=np.float32)
    print(input_data)
    interpreter.set_tensor(input_details[0]['index'], input_data)

    # Execute the inference
    t1 = time()
    interpreter.invoke()
    t2 = time()

    output_data = interpreter.get_tensor(output_details[0]['index'])
    print(output_data)
    print('Inference time:', t2 - t1, 's')

if __name__ == "__main__":
    main()
Thank you!
2 Answers

Answer #1
Just read on the TFLite Interpreter documentation page which target method you need to create. Example: I created a test model, simply converted it to TFLite via model.save, and read its input/output details to prove that saving and invoking work correctly.
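The inspection step this answer describes can be mirrored with a small helper. `summarize_details` is a hypothetical function, demonstrated here against a hand-built dict shaped like what the interpreter's `get_input_details()` returns:

```python
import numpy as np

def summarize_details(details):
    """Render interpreter detail entries as one readable line each.

    Each entry from get_input_details()/get_output_details() is a dict
    containing (among other keys) 'name', 'shape' (a numpy array) and
    'dtype' (a numpy type).
    """
    return [f"{d['name']}: shape={d['shape'].tolist()}, dtype={d['dtype'].__name__}"
            for d in details]

# Hand-built entry mimicking the real API's output for a (1, 42) float input
fake = [{'name': 'serving_default_input:0',
         'shape': np.array([1, 42]),
         'dtype': np.float32}]
print(summarize_details(fake))  # ['serving_default_input:0: shape=[1, 42], dtype=float32']
```

Comparing this summary between tf.lite and tflite_runtime is a cheap way to confirm both runtimes see the same model signature.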
Answer #2
I had a similar problem on my Raspberry Pi 4. Pip had installed tflite-runtime v2.5.0, while I had been building models with TF v2.9.1. The latest version of tflite_runtime is 2.10.0.
So I pinned the version when installing tflite_runtime.
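One way to catch this mismatch before it crashes is to compare version pairs up front. `runtime_at_least` is a hypothetical helper built on the rough assumption that a runtime older than the TensorFlow release that converted the model may encounter op versions it doesn't support:

```python
def runtime_at_least(runtime_version, converter_version):
    """Rough heuristic: the tflite_runtime major.minor should be at least
    as new as the TensorFlow major.minor used to convert the model."""
    rt = tuple(int(p) for p in runtime_version.split('.')[:2])
    tf_v = tuple(int(p) for p in converter_version.split('.')[:2])
    return rt >= tf_v

print(runtime_at_least('2.5.0', '2.7.0'))   # the asker's pairing -> False
print(runtime_at_least('2.10.0', '2.9.1'))  # the answerer's fix  -> True
```

Pinning the install (e.g. `pip install tflite-runtime==2.10.0`) is then the practical fix, as this answer describes.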