GPTCache raises "openai.error.APIError: HTTP code 405 from API ( )" when generating embeddings with an OpenAI model

rqdpfwrv  posted 24 days ago  in Other

I want to generate embeddings with OpenAI's text-embedding-3-small model. Here is my test code:

import os

from gptcache import cache
from gptcache.adapter import openai
from gptcache.embedding import OpenAI
from gptcache.manager import CacheBase, VectorBase, get_data_manager
from gptcache.similarity_evaluation.distance import SearchDistanceEvaluation

os.environ['OPENAI_API_KEY'] = 'my_api_key'
os.environ['OPENAI_API_BASE'] = 'my_api_base'

openai_embed_fnc = OpenAI('text-embedding-3-small', api_key=os.environ['OPENAI_API_KEY'])
vector_base = VectorBase('chromadb', dimension=1536, persist_directory='./.chroma')
data_manager = get_data_manager(CacheBase("sqlite"), vector_base=vector_base)
cache.init(
    embedding_func=openai_embed_fnc.to_embeddings,
    data_manager=data_manager,
    similarity_evaluation=SearchDistanceEvaluation(),
    )
cache.set_openai_key()

question = 'hi there'
response = openai.ChatCompletion.create(
    model='gpt-4o-mini',
    messages=[
        {
            'role': 'user',
            'content': question
        }
    ],
)
print(f"Answer: {response['choices'][0]['message']['content']}\n")

However, this code raises the following exception:

Traceback (most recent call last):
  File "/root/miniconda3/lib/python3.8/site-packages/openai/api_requestor.py", line 766, in _interpret_response_line
    data = json.loads(rbody)
  File "/root/miniconda3/lib/python3.8/json/__init__.py", line 357, in loads
    return _default_decoder.decode(s)
  File "/root/miniconda3/lib/python3.8/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/root/miniconda3/lib/python3.8/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "test1.py", line 45, in <module>
    response = openai.ChatCompletion.create(
  File "/root/miniconda3/lib/python3.8/site-packages/gptcache/adapter/openai.py", line 125, in create
    return adapt(
  File "/root/miniconda3/lib/python3.8/site-packages/gptcache/adapter/adapter.py", line 78, in adapt
    embedding_data = time_cal(
  File "/root/miniconda3/lib/python3.8/site-packages/gptcache/utils/time.py", line 9, in inner
    res = func(*args, **kwargs)
  File "/root/miniconda3/lib/python3.8/site-packages/gptcache/embedding/openai.py", line 60, in to_embeddings
    sentence_embeddings = openai.Embedding.create(model=self.model, input=data, api_base=self.api_base)
  File "/root/miniconda3/lib/python3.8/site-packages/openai/api_resources/embedding.py", line 33, in create
    response = super().create(*args, **kwargs)
  File "/root/miniconda3/lib/python3.8/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 155, in create
    response, _, api_key = requestor.request(
  File "/root/miniconda3/lib/python3.8/site-packages/openai/api_requestor.py", line 299, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/root/miniconda3/lib/python3.8/site-packages/openai/api_requestor.py", line 710, in _interpret_response
    self._interpret_response_line(
  File "/root/miniconda3/lib/python3.8/site-packages/openai/api_requestor.py", line 768, in _interpret_response_line
    raise error.APIError(
openai.error.APIError: HTTP code 405 from API ()

I suspect this may be an openai version issue, because I tested the embedding API in another Python environment with the same api_key and api_base, and it returned correct embeddings. Here is that code:

from openai import OpenAI

api_key = 'my_api_key'
base_url = 'my_api_base'
model_name = 'text-embedding-3-small'

client = OpenAI(
    api_key = api_key,
    base_url = base_url
)
response = client.embeddings.create(
    input='how are you',
    model=model_name
)
print(response.data[0].embedding)
cidc1ykv1#

It looks like GPTCache's openai embedding interface needs updating: the traceback shows gptcache/embedding/openai.py calling the legacy openai.Embedding.create endpoint, while your working snippet uses the new client.embeddings.create interface. The embedding part is quite simple, so you could try implementing your own embedding function for GPTCache.

nx7onnlm2#

OK, I'll give it a try. Thanks for your reply.
