I'm trying to use loadQAChain with a custom prompt. The code that creates the chain looks like this:
import { OpenAI } from 'langchain/llms/openai';
import { PineconeStore } from 'langchain/vectorstores/pinecone';
import { LLMChain, loadQAChain, ChatVectorDBQAChain } from 'langchain/chains';
import { PromptTemplate } from 'langchain/prompts';
const CONDENSE_PROMPT =
PromptTemplate.fromTemplate(`Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.
Chat History:
{chat_history}
Follow Up Input: {question}
Standalone question:`);
const QA_PROMPT =
PromptTemplate.fromTemplate(`You are a helpful AI assistant. Use the following pieces of context to answer the question at the end.
If you don't know the answer, just say you don't know. DO NOT try to make up an answer.
If the question is not related to the context, politely respond that you are tuned to only answer questions that are related to the context.
{context}
Question: {question}
Helpful answer in markdown:`);
export const makeChain = (vectorstore: PineconeStore) => {
  const questionGenerator = new LLMChain({
    llm: new OpenAI({ temperature: 0 }),
    prompt: CONDENSE_PROMPT,
  });
  const docChain = loadQAChain(
    // change modelName to gpt-4 if you have access to it
    new OpenAI({ temperature: 0, modelName: 'gpt-3.5-turbo' }),
    {
      prompt: QA_PROMPT,
    }
  );
  return new ChatVectorDBQAChain({
    vectorstore,
    combineDocumentsChain: docChain,
    questionGeneratorChain: questionGenerator,
    returnSourceDocuments: true,
    k: 4, // number of source documents to return. Change this figure as required.
  });
};
Whenever I call makeChain from an API route in Next.js, I get the following error:
error Error: Invalid _type: undefined
at loadQAChain (webpack-internal:///(sc_server)/./node_modules/langchain/dist/chains/question_answering/load.js:31:11)
at makeChain (webpack-internal:///(sc_server)/./lib/makechain.ts:32:83)
at POST (webpack-internal:///(sc_server)/./app/api/chat/route.tsx:40:80)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async eval (webpack-internal:///(sc_server)/./node_modules/next/dist/server/future/route-modules/app-route/module.js:265:37)
The error only occurs when I pass
{
  prompt: QA_PROMPT,
}
to loadQAChain.
My package.json looks like this:
{
  "private": true,
  "scripts": {
    "dev": "prisma generate && next dev",
    "server": "python manage.py runserver",
    "build": "prisma generate && prisma db push && next build",
    "start": "next start",
    "lint": "next lint",
    "ingest": "tsx -r dotenv/config scripts/ingest-data.ts"
  },
  "dependencies": {
    "@microsoft/fetch-event-source": "^2.0.1",
    "@pinecone-database/pinecone": "^0.1.6",
    "@prisma/client": "^4.14.0",
    "@radix-ui/react-accordion": "^1.1.2",
    "@types/node": "^18.11.9",
    "@types/react": "^18.0.25",
    "bcrypt": "^5.1.0",
    "clsx": "^1.2.1",
    "dotenv": "^16.3.1",
    "fs": "^0.0.1-security",
    "langchain": "^0.0.82",
    "lucide": "^0.246.0",
    "lucide-react": "^0.246.0",
    "next": "^13.4.2",
    "next-auth": "^4.22.1",
    "pdf-parse": "^1.1.1",
    "pinecone": "^0.1.0",
    "radix": "^0.0.0",
    "react": "^18.2.0",
    "react-dom": "^18.2.0",
    "react-hot-toast": "^2.4.1",
    "react-markdown": "^8.0.7",
    "sanitize-filename": "^1.6.3",
    "sanitize-html": "^2.10.0",
    "sass": "^1.63.4",
    "tailwind-merge": "^1.13.2"
  },
  "devDependencies": {
    "@types/bcrypt": "^5.0.0",
    "autoprefixer": "^10.4.4",
    "eslint": "8.11.0",
    "eslint-config-next": "^13.0.5",
    "postcss": "^8.4.12",
    "prisma": "^4.14.0",
    "tailwindcss": "^3.0.23",
    "typescript": "^4.6.2"
  }
}
My tsconfig.json looks like this:
{
  "compilerOptions": {
    "target": "es5",
    "lib": ["dom", "dom.iterable", "esnext"],
    "allowJs": true,
    "skipLibCheck": true,
    "baseUrl": ".",
    "paths": {
      "@/components/*": ["components/*"],
      "@/pages/*": ["pages/*"],
      "@/app/*": ["app/*"],
      "@/lib/*": ["lib/*"],
      "@/styles/*": ["styles/*"],
      "@/types/*": ["types/*"]
    },
    "strict": true,
    "forceConsistentCasingInFileNames": true,
    "noEmit": true,
    "esModuleInterop": true,
    "module": "esnext",
    "moduleResolution": "node",
    "resolveJsonModule": true,
    "isolatedModules": true,
    "jsx": "preserve",
    "incremental": true,
    "plugins": [
      {
        "name": "next"
      }
    ]
  },
  "include": ["next-env.d.ts", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"],
  "exclude": ["node_modules"]
}
My OpenAI API key and Pinecone environment are configured correctly, and I am able to run the data ingestion with the current environment. Any ideas?
I'm not very familiar with LangChain, so this is rather challenging to explain. I tried using different chains, but I couldn't get them to work. When I removed the prompt template the error went away, but of course that doesn't solve the problem.
1 Answer
Use the latest version of LangChain: I'm not sure about 0.0.82, but update to 0.0.95. A few pointers:

ChatVectorDBQAChain is deprecated; use ConversationalRetrievalQAChain instead.

loadQAChain now enforces a check on the chain type, which is why you get "Invalid _type: undefined". You need to specify the chain type explicitly (for the default "stuff" chain, pass type: 'stuff' alongside your prompt).

If you stream tokens, onTokenStream is the callback handler invoked for each new token. Hope this helps!
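Putting the pointers above together, a makeChain rewrite could look like the sketch below. This assumes langchain ~0.0.95, reuses the CONDENSE_PROMPT and QA_PROMPT templates from the question (abbreviated here), and treats onTokenStream as a hypothetical streaming callback you may pass in; exact option names can differ between versions, so check against the release you install.

```typescript
import { OpenAI } from 'langchain/llms/openai';
import { PineconeStore } from 'langchain/vectorstores/pinecone';
import { LLMChain, loadQAChain, ConversationalRetrievalQAChain } from 'langchain/chains';
import { PromptTemplate } from 'langchain/prompts';
import { CallbackManager } from 'langchain/callbacks';

// Same templates as in the question, abbreviated for brevity.
const CONDENSE_PROMPT = PromptTemplate.fromTemplate(
  `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.

Chat History:
{chat_history}
Follow Up Input: {question}
Standalone question:`
);

const QA_PROMPT = PromptTemplate.fromTemplate(
  `Use the following pieces of context to answer the question at the end.

{context}
Question: {question}
Helpful answer in markdown:`
);

export const makeChain = (
  vectorstore: PineconeStore,
  onTokenStream?: (token: string) => void // hypothetical per-token callback
) => {
  const questionGenerator = new LLMChain({
    llm: new OpenAI({ temperature: 0 }),
    prompt: CONDENSE_PROMPT,
  });

  // The chain type must now be given explicitly; omitting it is what
  // produces "Invalid _type: undefined".
  const docChain = loadQAChain(
    new OpenAI({
      temperature: 0,
      modelName: 'gpt-3.5-turbo',
      streaming: Boolean(onTokenStream),
      callbackManager: onTokenStream
        ? CallbackManager.fromHandlers({ handleLLMNewToken: onTokenStream })
        : undefined,
    }),
    { type: 'stuff', prompt: QA_PROMPT }
  );

  // ConversationalRetrievalQAChain replaces the deprecated
  // ChatVectorDBQAChain; it takes a retriever rather than a raw vector store.
  return new ConversationalRetrievalQAChain({
    retriever: vectorstore.asRetriever(4), // return 4 source documents
    combineDocumentsChain: docChain,
    questionGeneratorChain: questionGenerator,
    returnSourceDocuments: true,
  });
};
```

Note that k moves out of the chain options: with the retriever-based API you control the number of returned documents via vectorstore.asRetriever(k).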