2 answers
zc0qhyus1#
Modify `LLM_MODEL` and the corresponding endpoint URL and API key in the `.env` file to the model you want. If it's not supported, you'll need to mock up and write some files yourself.

ykejflvf2#
Modify `LLM_MODEL` and the corresponding endpoint URL and API key in the `.env` file to the model you want. If it's not supported, you'll need to mock up and write some files yourself.

I understand now. By examining the implementation of '/DB-GPT/dbgpt/storage/knowledge_graph/knowledge_graph.py', it is indeed necessary to customize some implementation files to support a proxy LLM.
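The `.env` change described above can be sketched as the fragment below. The key names `PROXY_SERVER_URL` and `PROXY_API_KEY` are assumptions based on common DB-GPT proxy-model setups, not confirmed by this thread; check your own `.env.template` for the exact variable names your version expects.

```shell
# Select the model to serve; an unsupported name means writing
# custom adapter files yourself, as the answers above note.
LLM_MODEL=proxyllm

# Assumed key names for the proxy endpoint and credentials --
# verify against your .env.template before relying on them.
PROXY_SERVER_URL=https://your-endpoint.example.com/v1/chat/completions
PROXY_API_KEY=your-api-key-here
```

After editing, restart the DB-GPT server so the new environment values are picked up.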