What happened?
When hosted on an AWS instance, the frontend's chat and brain APIs run into CORS issues. Please suggest a solution; this is very important.
Relevant log output
Chat API Errors:
In Browser:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://3.7.*0.***:5050/chat/c15e9855-f86b-4a33-8724-95ad134963**/question/stream. (Reason: CORS header ‘Access-Control-Allow-Origin’ missing). Status code: 500.
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://3.7.*0.***:5050/chat/3d90b29d-79e2-45fc-8350-cd64482ce8**/question/stream?brain_id=40ba47d7-51b2-4b2a-9247-89e29619efb0. (Reason: CORS request did not succeed). Status code: (null).
"TypeError: Network Error" in the frontend.
Brain API Errors:
In Browser:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://3.7.70.204:5050/brains/40ba47d7-51b2-4b2a-9247-89e29619efb0/. (Reason: CORS header ‘Access-Control-Allow-Origin’ missing). Status code: 500.
Axios error in frontend.
3 answers
o2g1uqev1#
Hey @gsrinivaskishan! I'm here to help you with any bugs, questions, or contributions. Let's work through this together.
To resolve the CORS issues with the chat and brain creation/update APIs when hosting Quivr on an AWS instance, you need to configure CORS in the FastAPI application. You can use CORSMiddleware from fastapi.middleware.cors to allow the necessary origins. Here is an example of how to configure the CORS settings:
Replace the origins list with the actual domains that should be allowed to access your API. This configuration enables CORS and specifies which origins, methods, and headers are allowed. 1(https://github.com/QuivrHQ/quivr/blob/main/cms/quivr/config/middlewares.js) 2(https://github.com/QuivrHQ/quivr/blob/main/backend/middlewares/cors.py)

hk8txs482#
@dosu
Still getting the error after changing the cors.py file for FastAPI. E.g.:
from fastapi.middleware.cors import CORSMiddleware

origins = [
    "http://localhost",
    "http://localhost:3000",
    "http://localhost:3001",
    "https://quivr.app",
    "https://www.quivr.app",
    "http://quivr.app",
    "http://www.quivr.app",
    "https://chat.quivr.app",
    "http://localhost:5050",
    "http://3.7.0.:3000",
    "http://3.7.0.:5050",
    "*",
]

def add_cors_middleware(app):
    app.add_middleware(
        CORSMiddleware,
        allow_origins=origins,
        allow_credentials=True,
        allow_methods=["*"],
        allow_headers=["*"],
    )
How do I get past this error? I'm also seeing the following when using chat, along with the CORS error in the browser:
File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 879, in async_streaming
backend-core | response = await openai_aclient.chat.completions.create(
Also, how can I enable only a local LLM like Ollama for Quivr on an AWS instance?
zazmityj3#
To resolve the CORS issue and enable only a local LLM (such as Ollama) for Quivr on your AWS instance, follow these steps:
Resolving the CORS issue
Update the cors.py configuration: make sure the origins list in backend/middlewares/cors.py contains only the specific origins you need. Remove the "*" wildcard to avoid allowing all origins, and make sure the domain you use to access your Quivr instance is included in the origins list.
Enabling only a local LLM (such as Ollama)
Install Ollama from their website, then run the following command to run Ollama in the background:
Move the migration file and reset the database:
Go to Supabase and, in the user_settings table, add the following value to the models column: If you want to set it as the default, you can delete the entire table with the following command:
Update the .env file: make sure the .env file is correctly configured to activate Ollama. Uncomment OLLAMA_API_BASE_URL and set it to the local URL of the Ollama API. By following these steps, you should be able to resolve the CORS issue and enable the Ollama LLM on your AWS instance. 1(https://github.com/QuivrHQ/quivr/blob/main/backend/middlewares/cors.py) 2(https://github.com/QuivrHQ/quivr/blob/main/docs/developers/contribution/llm/ollama.mdx) 3(https://github.com/QuivrHQ/quivr/blob/main/.env.example)