QAnything [BUG] Incompatible with ollama qwen:14b

ecr0jaav · posted 3 months ago · in Other
Follow (0) | Answers (2) | Views (157)

Is there an existing issue or discussion for this bug?

  • I have searched the existing issues and discussions

Is this question answered in the FAQ?

  • I have searched the FAQ

Current behavior
Integrating with ollama running qwen:14b raises an error.

Expected behavior
It should work with ollama.

Environment

- OS: Win 11
- NVIDIA Driver: 551
- CUDA:12.4
- docker:26.1.1
- docker-compose:v2.27.0
- NVIDIA GPU: 2080ti
- NVIDIA GPU Memory: 22 GB

QAnything logs
[2024-05-20 10:47:20 +0800] [9399] [ERROR] Exception occurred while handling uri: 'http://172.17.22.174:8777/api/local_doc_qa/local_doc_chat'
Traceback (most recent call last):
File "handle_request", line 132, in handle_request
"_asgi_lifespan",
File "/opt/miniconda3/envs/qanything-python/lib/python3.10/site-packages/sanic/response/types.py", line 547, in stream
await self.streaming_fn(self)
File "/opt/QAnything/qanything_kernel/qanything_server/handler.py", line 398, in generate_answer
async for resp, next_history in local_doc_qa.get_knowledge_based_answer(custom_prompt=custom_prompt,
File "/opt/QAnything/qanything_kernel/core/local_doc_qa.py", line 267, in get_knowledge_based_answer
source_documents = self.reprocess_source_documents(query=query,
File "/opt/QAnything/qanything_kernel/core/local_doc_qa.py", line 192, in reprocess_source_documents
query_token_num = self.llm.num_tokens_from_messages([query])
File "/opt/QAnything/qanything_kernel/connector/llm/llm_for_openai_api.py", line 107, in num_tokens_from_messages
raise NotImplementedError(
NotImplementedError: num_tokens_from_messages() is not implemented for model qwen:14b. See https://github.com/openai/openai-python/blob/main/chatml.md for information on how messages are converted to tokens.
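The NotImplementedError appears to come from an OpenAI-cookbook-style token counter in llm_for_openai_api.py, which relies on tiktoken's per-model lookup table; that table only covers OpenAI model names, so an ollama-served model like qwen:14b has no entry. A minimal sketch of the failing lookup (the helper name is ours, not QAnything's; it degrades gracefully if tiktoken is not installed):

```python
def has_tokenizer_mapping(model: str) -> bool:
    """Return True if tiktoken can map this model name to a tokenizer.

    Sketch of the lookup that fails inside llm_for_openai_api.py for
    non-OpenAI model names such as qwen:14b.
    """
    try:
        import tiktoken
    except ImportError:
        return False
    try:
        tiktoken.encoding_for_model(model)  # raises KeyError for unknown names
        return True
    except KeyError:
        return False

print(has_tokenizer_mapping("qwen:14b"))   # False: no mapping, hence the error
```

QAnything surfaces this missing mapping as the NotImplementedError shown in the log, rather than falling back to a default encoding.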

Steps to reproduce

  1. ollama run qwen:14b
  2. script:
    -b 'http://172.17.22.174:11434/v1' -k 'ollama' -n 'qwen:14b' -l '4096'
  3. Run QAnything and start a chat

Notes

  • No response
ftf50wuq #1

I modified qanything_kernel/connector/llm/llm_for_openai_api.py so that it returns a default token count, and it then runs against the OpenAI-compatible API. But a new problem appeared: nothing is retrieved from the knowledge base, and the answer comes entirely from the LLM itself.
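The workaround described above can be sketched as a fallback token counter. This is a hypothetical patch, not the project's actual fix: the function name mirrors QAnything's `num_tokens_from_messages`, but the `cl100k_base` fallback and the character-based estimate are assumptions:

```python
def num_tokens_from_messages(messages, model="qwen:14b"):
    """Count prompt tokens, with fallbacks for models that tiktoken
    cannot map (e.g. ollama-served qwen:14b)."""
    try:
        import tiktoken
        try:
            enc = tiktoken.encoding_for_model(model)
        except KeyError:
            # Unknown model name: use cl100k_base instead of raising
            # NotImplementedError as the original code does.
            enc = tiktoken.get_encoding("cl100k_base")
        return sum(len(enc.encode(m)) for m in messages)
    except ImportError:
        # Crude estimate when tiktoken is unavailable: ~4 chars per token.
        return sum(max(1, len(m) // 4) for m in messages)

print(num_tokens_from_messages(["What is QAnything?"]))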

zbsbpyhn #2

(Quoting ftf50wuq's answer above.)

I ran into the same problem!
