Bug Description
llama-index-multi-modal-llms-ollama and llama-index-llms-ollama cannot be installed together because they depend on conflicting versions of the ollama client.
Version
0.11.1
Steps to Reproduce
Attempt to install llama-index-embeddings-ollama~=0.3.0, llama-index-multi-modal-llms-ollama~=0.2.0, and llama-index-llms-ollama~=0.3.0 in the same environment.
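For reference, the relevant portion of a requirements.txt that triggers the resolver failure looks like this (only the three Ollama-related pins from the report are shown; the rest of the file is omitted):

llama-index-embeddings-ollama~=0.3.0
llama-index-llms-ollama~=0.3.0
llama-index-multi-modal-llms-ollama~=0.2.0

Installing it with python3 -m pip install -r requirements.txt, as in the failed build step shown in the log below, produces the resolver error.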
Relevant Logs/Traceback
77.79 INFO: pip is looking at multiple versions of llama-index-multi-modal-llms-ollama to determine which version is compatible with other requirements. This could take a while.
77.86 ERROR: Cannot install -r requirements.txt (line 69), -r requirements.txt (line 78) and -r requirements.txt (line 87) because these package versions have conflicting dependencies.
77.86
77.86 The conflict is caused by:
77.86 llama-index-embeddings-ollama 0.3.0 depends on ollama<0.4.0 and >=0.3.1
77.86 llama-index-llms-ollama 0.3.0 depends on ollama>=0.3.0
77.86 llama-index-multi-modal-llms-ollama 0.2.0 depends on ollama<0.2.0 and >=0.1.6
77.86
77.86 To fix this you could try to:
77.86 1. loosen the range of package versions you've specified
77.86 2. remove package versions to allow pip to attempt to solve the dependency conflict
77.86
77.88 ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
------
failed to solve: process "/bin/sh -c apt-get remove -y python3-yaml && python3 -m pip install -r requirements.txt" did not complete successfully: exit code: 1
2 Answers
mwg9r5ms1#
I found a similar open issue: [Bug]: Conflicting versions. The suggested solution is to adjust the dependencies in your
pyproject.toml
file to ensure compatibility between the specified versions. Here is an example configuration:
[tool.poetry.dependencies]
python = ">=3.8.1,<4.0"
llama-index-core = "^0.10.1"
torch = ">=2.1.2,<2.3.1" # Adjusted to avoid conflict with torch 2.3.1
pillow = ">=10.2.0,<10.4.0" # Adjusted to avoid conflict with Pillow 10.4.0
torchvision = "^0.17.0"
ftfy = "^6.1.3"
llama-index-multi-modal-llms-ollama = "~0.1.3"
llama-index-llms-ollama = "~0.2.2"
[tool.poetry.group.dev.dependencies]
ipython = "8.10.0"
jupyter = "^1.0.0"
mypy = "0.991"
pre-commit = "3.2.0"
pylint = "2.15.10"
pytest = "7.2.1"
pytest-mock = "3.11.1"
ruff = "0.0.292"
tree-sitter-languages = "^1.8.0"
types-Deprecated = ">=0.1.0"
types-PyYAML = "^6.0.12.12"
types-protobuf = "^4.24.0.4"
types-redis = "4.5.5.0"
types-requests = "2.28.11.8"
types-setuptools = "67.1.0.0"
[tool.poetry.group.dev.dependencies.black]
extras = ["jupyter"]
version = "<=23.9.1,>=23.7.0"
[tool.poetry.group.dev.dependencies.codespell]
extras = ["toml"]
version = ">=v2.2.6"
zf9nrax12#
The llama-index-multi-modal-llms-ollama package needs an internal update to support the ollama client 0.3.0.
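The resolver output above shows where that update is needed: llama-index-multi-modal-llms-ollama 0.2.0 still pins ollama to >=0.1.6,<0.2.0, while the other two packages require ollama >=0.3.x. A fix on the package side would mean relaxing that pin in its own pyproject.toml along these lines (the exact bounds here are an assumption for illustration, not a published change):

[tool.poetry.dependencies]
ollama = ">=0.3.0,<0.4.0"  # assumed relaxed range; version 0.2.0 of this package currently requires >=0.1.6,<0.2.0

Until such a release exists, the practical workarounds are the downgrades suggested in the previous answer or removing one of the conflicting packages.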