llama_index [Bug]: BedrockConverse does not work when streaming is combined with tool use

mwg9r5ms · posted 23 days ago

Bug Description

I am trying to stream a response from Anthropic Claude Sonnet 3.5 using BedrockConverse, but I run into an error: KeyError: 'toolUse'. It seems that the tool-use block is not being parsed correctly. The error happens when I use llm.stream_chat_with_tools; it does not happen, with the same setup, when I use chat_with_tools.

Version

llama-index core: 0.10.67, llama-index-llms-bedrock-converse==0.1.6

Steps to Reproduce

from llama_index.core.tools import FunctionTool
from llama_index.llms.bedrock_converse import BedrockConverse

llm = BedrockConverse(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",
    region_name="eu-central-1",
)

def add(x: int, y: int) -> int:
    "Add two numbers."
    return x + y

add_tool = FunctionTool.from_defaults(add)

resp = llm.stream_chat_with_tools(tools=[add_tool], user_msg="How much is 2 + 3")
for token in resp:
    if token is not None:
        print(token.delta, sep="", flush=False, end="")

Relevant Logs/Traceback

KeyError                                  Traceback (most recent call last)
Cell In[113], line 16
     13 add_tool = FunctionTool.from_defaults(add)
     15 resp = llm.stream_chat_with_tools(tools=[add_tool], user_msg="How much is 2 + 3")
---> 16 for token in resp:
     17     if token is not None:
     18         print(token.delta, sep="", flush=False, end="")

File ~/.pyenv/versions/training-assistant/lib/python3.11/site-packages/llama_index/core/llms/callbacks.py:185, in llm_chat_callback.<locals>.wrap.<locals>.wrapped_llm_chat.<locals>.wrapped_gen()
    183 last_response = None
    184 try:
--> 185     for x in f_return_val:
    186         dispatcher.event(
    187             LLMChatInProgressEvent(
    188                 messages=messages,
   (...)
    191             )
    192         )
    193         yield cast(ChatResponse, x)

File ~/.pyenv/versions/training-assistant/lib/python3.11/site-packages/llama_index/llms/bedrock_converse/base.py:362, in BedrockConverse.stream_chat.<locals>.gen()
    348     yield ChatResponse(
    349         message=ChatMessage(
    350             role=role,
   (...)
    359         raw=response,
    360     )
    361 elif content_block_start := chunk.get("contentBlockStart"):
--> 362     tool_use = content_block_start["toolUse"]
    363     content = join_two_dicts(content, tool_use)
    364     (
    365         _,
    366         tool_calls,
    367         tool_call_ids,
    368         status,
    369     ) = self._get_content_and_tool_calls(content=content)

KeyError: 'toolUse'
jvidinwx · #1

Hey @omrihar! I'm here to help you with bugs, questions, and even becoming a contributor. Let's squash these bugs together! 🤖
The KeyError: 'toolUse' occurs with llm.stream_chat_with_tools because the stream_chat method in BedrockConverse expects a toolUse key to be present in the response chunk. This key is used to process tool-use information during streaming.
In the stream_chat method, the code tries to access content_block_start["toolUse"] directly:

elif content_block_start := chunk.get("contentBlockStart"):
    tool_use = content_block_start["toolUse"]
    content = join_two_dicts(content, tool_use)
    (
        _,
        tool_calls,
        tool_call_ids,
        status,
    ) = self._get_content_and_tool_calls(content=content)

If the toolUse key is missing from the response chunk, this raises a KeyError.
The chat_with_tools method, by contrast, does not rely on streaming: it processes the entire response at once, so it never needs a toolUse key to be present in intermediate chunks.
To resolve this, make sure that when using llm.stream_chat_with_tools, the response from Anthropic Claude Sonnet 3.5 includes the toolUse key inside the contentBlockStart block. The Anthropic integration in llama-index shows how a ToolUseBlock is handled during streaming:

for r in response:
    if isinstance(r, ContentBlockDeltaEvent):
        if isinstance(r.delta, TextDelta):
            content_delta = r.delta.text
            content += content_delta
        else:
            if not isinstance(cur_tool_call, ToolUseBlock):
                raise ValueError("Tool call not started")
            content_delta = r.delta.partial_json
            cur_tool_json += content_delta
            try:
                argument_dict = parse_partial_json(cur_tool_json)
                cur_tool_call.input = argument_dict
            except ValueError:
                pass
    elif isinstance(r, ContentBlockStartEvent):
        if isinstance(r.content_block, ToolUseBlock):
            cur_tool_call = r.content_block
            cur_tool_json = ""
    elif isinstance(r, ContentBlockStopEvent):
        if isinstance(cur_tool_call, ToolUseBlock):
            cur_tool_calls.append(cur_tool_call)

This code ensures that ToolUseBlock is handled correctly during streaming, which should prevent the KeyError. Sources: https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/llms/llama-index-llms-bedrock-converse/llama_index/llms/bedrock_converse/base.py and https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/llms/llama-index-llms-anthropic/llama_index/llms/anthropic/base.py
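A defensive variant of the failing branch can be sketched as follows. This is a simplified stand-in, not the library's actual code: the internal join_two_dicts helper is replaced by a plain dict merge, and the chunk shape is assumed from the traceback above. The point is only that using .get() instead of direct indexing lets contentBlockStart events without a toolUse block pass through unchanged.

```python
def merge_tool_use(chunk: dict, content: dict) -> dict:
    """Merge a toolUse block from a stream chunk into `content`, if present.

    Unlike the failing branch in BedrockConverse.stream_chat, a
    contentBlockStart event that carries no toolUse key (e.g. a plain
    text block) is tolerated and `content` is returned unchanged.
    """
    tool_use = chunk.get("contentBlockStart", {}).get("toolUse")
    if tool_use is None:
        return content
    merged = dict(content)
    merged.update(tool_use)
    return merged
```

With this guard, a chunk like {"contentBlockStart": {}} no longer raises KeyError; only chunks that actually carry a toolUse block update the accumulated content.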

  • To continue the conversation with Dosu, mention @dosu.
brjng4g3 · #2

I don't have access to Bedrock to test this, but I'd love a PR :)

bksxznpy · #3

@logan-markewich I honestly don't know where to start with this, but with some direction I could give it a try.

At the moment I don't even understand what the syntax for streaming with tool calls is supposed to look like (do you call the tool parser on the response object you get from the stream? On the last token?). As far as I can tell, documentation on using stream_chat_with_tools is completely nonexistent.
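For what it's worth, the pattern other llama-index streaming integrations suggest is to drain the generator for text deltas and then inspect the last response for tool calls. Here is a minimal mock of that flow; FakeResponse and its attribute layout are illustrative assumptions, not the actual ChatResponse API:

```python
from dataclasses import dataclass, field

@dataclass
class FakeResponse:
    """Stand-in for a streamed chat response chunk (assumed shape)."""
    delta: str = ""
    tool_calls: list = field(default_factory=list)

def drain_stream(stream):
    """Print text deltas as they arrive and return the final response."""
    last = None
    for resp in stream:
        if resp.delta:
            print(resp.delta, end="", flush=True)
        last = resp
    return last

# Simulated stream: text chunks first, tool call carried by the final chunk.
fake_stream = iter([
    FakeResponse(delta="Let me add those numbers. "),
    FakeResponse(tool_calls=[{"name": "add", "input": {"x": 2, "y": 3}}]),
])
final = drain_stream(fake_stream)
```

Under this assumption, the tool parser would be called on the final response after the stream is exhausted, not on each intermediate token.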
