I am running into an error when trying to deploy a TensorFlow model on AWS SageMaker with a custom inference script. The script imports TensorFlow, but the deployment logs show a ModuleNotFoundError: No module named 'tensorflow' error. The model was trained with TensorFlow on SageMaker and is stored in an S3 bucket. It is being deployed with the sagemaker.tensorflow.TensorFlowModel class, and the inference script uses a model saved in TensorFlow's SavedModel format. I need help resolving the error and deploying the model successfully for inference.
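For context, the SageMaker TensorFlow Serving container looks for the custom script and its dependencies under a code/ directory inside the model archive. A sketch of the expected layout (the version directory name 1/ is just an example):

```
model.tar.gz
├── 1/                        # SavedModel export, one numbered version directory
│   ├── saved_model.pb
│   └── variables/
└── code/
    ├── inference.py          # custom input/output handlers
    └── requirements.txt      # extra pip packages installed at container startup
```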
Logs:
(Note: when I push inference.py to GitHub it is inside the checkpoint, but it is not on my SageMaker instance.)
[2023-07-10 19:28:48 +0000] [6012] [INFO] Reason: Worker failed to boot.
WARNING:__main__:unexpected gunicorn exit (status: 768). restarting.
INFO:__main__:gunicorn version info:
gunicorn (version 20.0.4)
INFO:__main__:started gunicorn (pid: 6019)
[2023-07-10 19:28:48 +0000] [6019] [INFO] Starting gunicorn 20.0.4
[2023-07-10 19:28:48 +0000] [6019] [INFO] Listening at: unix:/tmp/gunicorn.sock (6019)
[2023-07-10 19:28:48 +0000] [6019] [INFO] Using worker: gevent
[2023-07-10 19:28:48 +0000] [6021] [INFO] Booting worker with pid: 6021
INFO:python_service:Creating grpc channel for port: 13000
[2023-07-10 19:28:49 +0000] [6021] [ERROR] Exception in worker process
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/gunicorn/arbiter.py", line 583, in spawn_worker
worker.init_process()
File "/usr/local/lib/python3.8/site-packages/gunicorn/workers/ggevent.py", line 162, in init_process
super().init_process()
File "/usr/local/lib/python3.8/site-packages/gunicorn/workers/base.py", line 119, in init_process
self.load_wsgi()
File "/usr/local/lib/python3.8/site-packages/gunicorn/workers/base.py", line 144, in load_wsgi
self.wsgi = self.app.wsgi()
File "/usr/local/lib/python3.8/site-packages/gunicorn/app/base.py", line 67, in wsgi
self.callable = self.load()
File "/usr/local/lib/python3.8/site-packages/gunicorn/app/wsgiapp.py", line 49, in load
return self.load_wsgiapp()
File "/usr/local/lib/python3.8/site-packages/gunicorn/app/wsgiapp.py", line 39, in load_wsgiapp
return util.import_app(self.app_uri)
File "/usr/local/lib/python3.8/site-packages/gunicorn/util.py", line 358, in import_app
mod = importlib.import_module(module)
File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
File "<frozen importlib._bootstrap>", line 991, in _find_and_load
File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 783, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/sagemaker/python_service.py", line 409, in <module>
resources = ServiceResources()
File "/sagemaker/python_service.py", line 395, in __init__
self._python_service_resource = PythonServiceResource()
File "/sagemaker/python_service.py", line 82, in __init__
self._handler, self._input_handler, self._output_handler = self._import_handlers()
File "/sagemaker/python_service.py", line 283, in _import_handlers
spec.loader.exec_module(inference)
File "<frozen importlib._bootstrap_external>", line 783, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/opt/ml/model/code/inference.py", line 1, in <module>
import tensorflow as tf
ModuleNotFoundError: No module named 'tensorflow'
[2023-07-10 19:28:49 +0000] [6021] [INFO] Worker exiting (pid: 6021)
[2023-07-10 19:28:49 +0000] [6019] [INFO] Shutting down: Master
[2023-07-10 19:28:49 +0000] [6019] [INFO] Reason: Worker failed to boot.
WARNING:__main__:unexpected gunicorn exit (status: 768). restarting.
INFO:__main__:gunicorn version info:
gunicorn (version 20.0.4)
INFO:__main__:started gunicorn (pid: 6026)
[2023-07-10 19:28:49 +0000] [6026] [INFO] Starting gunicorn 20.0.4
[2023-07-10 19:28:49 +0000] [6026] [INFO] Listening at: unix:/tmp/gunicorn.sock (6026)
[2023-07-10 19:28:49 +0000] [6026] [INFO] Using worker: gevent
[2023-07-10 19:28:49 +0000] [6028] [INFO] Booting worker with pid: 6028
INFO:python_service:Creating grpc channel for port: 13000
GitHub link: https://github.com/Its-suLav-D/Inventory-Monitoring-At-Distribution-Centers/tree/master
I have tried everything ChatGPT suggested.
1 Answer
The inference container runs TensorFlow Serving, which does not bundle the tensorflow Python package. You can install it either via a requirements.txt file alongside your script, or by running os.system("pip install tensorflow") in your code before importing tensorflow.
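The in-code workaround can be sketched as a small guard at the top of inference.py. The helper name ensure_package is my own; compared with the answer's os.system call, subprocess.check_call raises if the install fails instead of failing silently:

```python
import importlib
import subprocess
import sys


def ensure_package(name: str) -> None:
    """Install `name` with pip if it cannot be imported.

    Workaround for serving containers that ship without a package,
    e.g. the TensorFlow Serving image lacking `tensorflow` for the
    Python inference handlers.
    """
    try:
        importlib.import_module(name)
    except ModuleNotFoundError:
        # Use the running interpreter's pip so the package lands in
        # the same environment that will import it.
        subprocess.check_call([sys.executable, "-m", "pip", "install", name])


# At the top of inference.py, before the TensorFlow import:
# ensure_package("tensorflow")
# import tensorflow as tf
```

The requirements.txt route is usually preferable, since the container installs it once at startup rather than on first worker boot.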