URL
https://python.langchain.com/v0.2/docs/tutorials/rag/
Checklist
- I added a very descriptive title to this issue.
- If applicable, I included a link to the documentation page I am referring to.
Issue with current documentation:
The error shows that the data the llm receives is of a type that has no `shape` attribute:
from langchain.chains import create_retrieval_chain
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain_core.prompts import ChatPromptTemplate
system_prompt = (
"You are an assistant for question-answering tasks. "
"Use the following pieces of retrieved context to answer "
"the question. If you don't know the answer, say that you "
"don't know. Use three sentences maximum and keep the "
"answer concise."
"\n\n"
"{context}"
)
prompt = ChatPromptTemplate.from_messages(
[
("system", system_prompt),
("human", "{input}"),
]
)
question_answer_chain = create_stuff_documents_chain(llm, prompt)
rag_chain = create_retrieval_chain(retriever, question_answer_chain)
response = rag_chain.invoke({"input": "What is Task Decomposition?"})
print(response["answer"])
Like this:
File "/home/desir/PycharmProjects/pdf_parse/rag/create_stuff_chain.py", line 28, in <module>
response = rag_chain.invoke({"input": "文章主旨"})
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 4573, in invoke
return self.bound.invoke(
^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2504, in invoke
input = step.invoke(input, config)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/passthrough.py", line 469, in invoke
return self._call_with_config(self._invoke, input, config, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1598, in _call_with_config
context.run(
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/config.py", line 380, in call_func_with_variable_args
return func(input, **kwargs) # type: ignore[call-arg]
^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/passthrough.py", line 456, in _invoke
**self.mapper.invoke(
^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3149, in invoke
output = {key: future.result() for key, future in zip(steps, futures)}
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3149, in <dictcomp>
output = {key: future.result() for key, future in zip(steps, futures)}
^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/concurrent/futures/_base.py", line 456, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 4573, in invoke
return self.bound.invoke(
^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2504, in invoke
input = step.invoke(input, config)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3976, in invoke
return self._call_with_config(
^^^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1598, in _call_with_config
context.run(
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/config.py", line 380, in call_func_with_variable_args
return func(input, **kwargs) # type: ignore[call-arg]
^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3844, in _invoke
output = call_func_with_variable_args(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/langchain_core/runnables/config.py", line 380, in call_func_with_variable_args
return func(input, **kwargs) # type: ignore[call-arg]
^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1541, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/transformers/models/mistral/modeling_mistral.py", line 1139, in forward
outputs = self.model(
^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1541, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/desir/soft/anaconda3/envs/pdf_parse/lib/python3.11/site-packages/transformers/models/mistral/modeling_mistral.py", line 937, in forward
batch_size, seq_length = input_ids.shape
^^^^^^^^^^^^^^^
AttributeError: 'ChatPromptValue' object has no attribute 'shape'
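The failure mode can be reproduced in miniature without any of the libraries involved. Below, `ChatPromptValue` is a hypothetical stand-in for the langchain_core class, and `forward` mimics the failing line from modeling_mistral.py in the traceback:

```python
# Hypothetical stand-in for langchain_core's ChatPromptValue: a plain
# object holding formatted prompt text, with no tensor interface.
class ChatPromptValue:
    def __init__(self, text):
        self.text = text

# Mimics the failing line in modeling_mistral.py: the model's forward()
# expects a tensor-like input_ids that has a .shape attribute.
def forward(input_ids):
    batch_size, seq_length = input_ids.shape
    return batch_size, seq_length

try:
    forward(ChatPromptValue("What is Task Decomposition?"))
except AttributeError as exc:
    print(exc)  # 'ChatPromptValue' object has no attribute 'shape'
```

This matches the traceback: the chain formats the prompt into such an object and calls the llm on it, and a raw torch module's `forward` then receives the prompt object where it expects `input_ids`.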
My model is loaded from local disk:
import gc
import os

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

print(torch.version.cuda)
gc.collect()
torch.cuda.empty_cache()

os.environ['HTTP_PROXY'] = 'http://127.0.0.1:7890'
os.environ['HTTPS_PROXY'] = 'http://127.0.0.1:7890'

cache_dir = os.path.expanduser("~/.mistral")
cache_mistral_tokenizer = AutoTokenizer.from_pretrained(pretrained_model_name_or_path=cache_dir)
cache_mistral_model = AutoModelForCausalLM.from_pretrained(cache_dir)
Idea or request for content:
How can I modify the data before it is passed to the llm?
2 answers
h9vpoimq1#
Please reload your model and check whether it is correctly installed in your folder. Try asking it directly instead of using the chain, for example:
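A minimal sketch of such a direct query (`ask_directly` is a hypothetical helper; it assumes the tokenizer and model loaded from `cache_dir` above):

```python
# Hypothetical helper: query the model directly, bypassing LangChain.
# The tokenizer turns the plain string into real tensors, so the model's
# forward() receives an input_ids that actually has a .shape attribute.
def ask_directly(model, tokenizer, question, max_new_tokens=64):
    inputs = tokenizer(question, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Usage with the locally cached model (heavy; requires the weights):
#   print(ask_directly(cache_mistral_model, cache_mistral_tokenizer,
#                      "What is Task Decomposition?"))
```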
If you still get the same error, I would suggest re-downloading the model from the Hub.
gtlvzcf82#
Thanks! It works now! You are so kind!