When I receive a response from OpenAI's text-davinci-003 model, I am able to extract the text from the response with the following PHP code:
$response = $response->choices[0]->text;
Here is the response returned by the davinci model:
{
"id": "cmpl-uqkvlQyYK7bGYrRHQ0eXlWi7",
"object": "text_completion",
"created": 1589478378,
"model": "text-davinci-003",
"choices": [
{
"text": "\n\nThis is indeed a test",
"index": 0,
"logprobs": null,
"finish_reason": "length"
}
],
"usage": {
"prompt_tokens": 5,
"completion_tokens": 7,
"total_tokens": 12
}
}
I am now trying to modify my code to use the recently released gpt-3.5-turbo model, which returns a slightly different response:
{
"id": "chatcmpl-123",
"object": "chat.completion",
"created": 1677652288,
"choices": [{
"index": 0,
"message": {
"role": "assistant",
"content": "\n\nHello there, how may I assist you today?",
},
"finish_reason": "stop"
}],
"usage": {
"prompt_tokens": 9,
"completion_tokens": 12,
"total_tokens": 21
}
}
My question is: how do I modify this line of code:
$response = $response->choices[0]->text;
...so that it grabs the content of the response message?
1 Answer
With the chat completion format, the generated text is no longer in choices[0].text; it is nested under choices[0].message.content. The same path applies in the Python, NodeJS, and PHP client libraries, so in PHP you only need to change the accessor, as shown below.
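A one-line sketch of the change, assuming $response holds the decoded chat completion object returned by the API:

$response = $response->choices[0]->message->content;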
PHP working example
If you run test.php, the OpenAI API will return the following completion:
"The capital of the United Kingdom is London."
test.php
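A minimal test.php sketch along those lines, assuming the PHP cURL extension is installed and the API key is supplied via the OPENAI_API_KEY environment variable; the prompt string is inferred from the sample completion above and is illustrative only:

<?php
// Build the chat completion request for gpt-3.5-turbo.
$payload = json_encode([
    'model'    => 'gpt-3.5-turbo',
    'messages' => [
        ['role' => 'user', 'content' => 'What is the capital of the United Kingdom?'],
    ],
    'max_tokens'  => 16,
    'temperature' => 0,
]);

// Send the request to the chat completions endpoint with cURL.
$ch = curl_init('https://api.openai.com/v1/chat/completions');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $payload,
    CURLOPT_HTTPHEADER     => [
        'Content-Type: application/json',
        'Authorization: Bearer ' . getenv('OPENAI_API_KEY'),
    ],
]);
$raw = curl_exec($ch);
if ($raw === false) {
    die('Request failed: ' . curl_error($ch));
}
curl_close($ch);

// Decode the JSON and read choices[0]->message->content
// (instead of choices[0]->text used with text-davinci-003).
$response = json_decode($raw);
echo $response->choices[0]->message->content, PHP_EOL;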