[Bug] Error in streaming mode if LLM tool call does not require parameters
Description
When using the qwen3-32b model in streaming mode, an error occurs if the called tool does not require parameters. The issue is specific to the qwen3-32b model and does not occur with the deepseek-chat API. The model returns an empty string for the tool call's `arguments` field, and when the conversation (including that tool call) is replayed to the server on the follow-up request, the empty string fails to parse as JSON and the server responds with a 400 error.
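The 400 message in the log below, `Expecting value: line 1 column 1 (char 0)`, is exactly what Python's `json` module raises when asked to parse an empty string, which suggests the server side is calling the equivalent of `json.loads` on the empty `arguments` field. A minimal illustration of the parse failure itself (not agno code):

```python
import json

# An empty string is not valid JSON, while "{}" is.
# This reproduces the server-side parse error seen in the logs.
try:
    json.loads("")  # what qwen3-32b emits for a no-parameter tool call
except json.JSONDecodeError as e:
    print(e)  # Expecting value: line 1 column 1 (char 0)

print(json.loads("{}"))  # an empty JSON object parses fine
```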
Steps to Reproduce
- Select the qwen3-32b model as the model to use.
- Set `stream=True` to enable streaming mode.
- Select a tool that does not require parameters.
Agent Configuration (if applicable)
The following code snippet demonstrates the minimum implementation required to reproduce the issue:
import random

from agno.agent import Agent
from agno.models.openai.like import OpenAILike
from agno.tools import tool


@tool(show_result=True)
def get_number() -> str:
    """Get the number."""
    # In a real implementation, this would call a number API
    random_number = random.randint(1, 100)
    return f"The number is {random_number}."


model = OpenAILike(
    id="qwen3-32b", base_url="http://127.0.0.1:8080", api_key="sk-xxxxx"
)
agent = Agent(model=model, tools=[get_number], debug_mode=True)
agent.print_response("Give me a random number.", stream=True)
Expected Behavior
The tool should be called, and the results should be output.
Actual Behavior
The actual behavior is as follows:
PS D:\Code\agent\agno_bug> & D:/Code/agent/agno_bug/.venv/Scripts/python.exe d:/Code/agent/agno_bug/main.py
DEBUG ******************************************* Agent ID: d70c8231-afc4-4770-8ff7-812fc329179d *******************************************
DEBUG ****************************************** Session ID: 4a93145c-b17e-4f0a-aace-08780e3fe147 ******************************************
DEBUG *************************************** Agent Run Start: 4d03b13d-f073-4f03-b938-ffa1b1a242e9 ****************************************
DEBUG Processing tools for model
DEBUG Added tool get_number
DEBUG ---------------------------------------------------- OpenAI Response Stream Start ----------------------------------------------------
DEBUG ---------------------------------------------------------- Model: qwen3-32b ----------------------------------------------------------
DEBUG ================================================================ user ================================================================
DEBUG Give me a random number.
DEBUG messages in stream: [{'role': 'user', 'content': 'Give me a random number.'}]
DEBUG ============================================================= assistant ==============================================================
DEBUG <think>
Okay, the user is asking for a random number. Let me check the tools available. There's a function called get_number with no parameters required.
Since the user wants a random number, I should call that function. But wait, the function's description just says "Get the number." Does it generate
a random one or return a fixed value? The parameters are empty, so maybe it's designed to return a random number by default. I'll proceed to call
get_number without any arguments. I need to make sure the tool call is correctly formatted in JSON within the XML tags. Alright, that should do it.
</think>
DEBUG Tool Calls:
- ID: 'chatcmpl-tool-1077bda4db6941d1936130b9f9d57102'
Name: 'get_number'
DEBUG ************************************************************* METRICS **************************************************************
DEBUG * Tokens: input=139, output=144, total=283
DEBUG * Time: 3.3514s
DEBUG * Tokens per second: 42.9673 tokens/s
DEBUG * Time to first token: 0.9840s
DEBUG ************************************************************* METRICS **************************************************************
DEBUG Getting function get_number
DEBUG Running: get_number()
DEBUG ================================================================ tool ================================================================
DEBUG Tool call Id: chatcmpl-tool-1077bda4db6941d1936130b9f9d57102
DEBUG The number is 67.
DEBUG *********************************************************** TOOL METRICS ***********************************************************
DEBUG * Time: 0.0022s
DEBUG *********************************************************** TOOL METRICS ***********************************************************
DEBUG messages in stream: [{'role': 'user', 'content': 'Give me a random number.'}, {'role': 'assistant', 'content': '<think>\nOkay, the user is asking
for a random number. Let me check the tools available. There\'s a function called get_number with no parameters required. Since the user wants a
random number, I should call that function. But wait, the function\'s description just says "Get the number." Does it generate a random one or
return a fixed value? The parameters are empty, so maybe it\'s designed to return a random number by default. I\'ll proceed to call get_number
without any arguments. I need to make sure the tool call is correctly formatted in JSON within the XML tags. Alright, that should do
it.\n</think>\n\n', 'tool_calls': [{'id': 'chatcmpl-tool-1077bda4db6941d1936130b9f9d57102', 'type': 'function', 'function': {'name': 'get_number',
'arguments': ''}}]}, {'role': 'tool', 'content': 'The number is 67.', 'tool_call_id': 'chatcmpl-tool-1077bda4db6941d1936130b9f9d57102'}]
ERROR API status error from OpenAI API: Error code: 400 - {'object': 'error', 'message': 'Expecting value: line 1 column 1 (char 0)', 'type':
'BadRequestError', 'param': None, 'code': 400}
▰▱▱▱▱▱▱ Thinking...
┏━ Message ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ ┃
┃ Give me a random number. ┃
┃ ┃
┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛
┏━ Tool Calls ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ ┃
┃ • get_number() ┃
┃ ┃
┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛
┏━ Response (3.4s) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ ┃
┃ <think> ┃
┃ Okay, the user is asking for a random number. Let me check the tools available. There's a function called get_number with no parameters required. ┃
┃ Since the user wants a random number, I should call that function. But wait, the function's description just says "Get the number." Does it generate a ┃
┃ random one or return a fixed value? The parameters are empty, so maybe it's designed to return a random number by default. I'll proceed to call ┃
┃ get_number without any arguments. I need to make sure the tool call is correctly formatted in JSON within the XML tags. Alright, that should do it. ┃
┃ </think> ┃
┃ ┃
┃ The number is 67. ┃
┃ ┃
┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛
Traceback (most recent call last):
File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\agno\models\openai\chat.py", line 455, in invoke_stream
yield from self.get_client().chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\openai\_utils\_utils.py", line 287, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\openai\resources\chat\completions\completions.py", line 925, in create
return self._post(
^^^^^^^^^^^
File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\openai\_base_client.py", line 1239, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Code\agent\agno_bug\.venv\Lib\site-packages\openai\_base_client.py", line 1034, in request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'object': 'error', 'message': 'Expecting value: line 1 column 1 (char 0)', 'type': 'BadRequestError', 'param': None, 'code': 400}
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "d:\Code\agent\agno_bug\main.py", line 24, in <module>
agent.print_response("Give me a random number.", stream=True)
**Q&A: Bug in Streaming Mode if LLM Tool Call Does Not Require Parameters**
**Q: What is the bug in streaming mode if the LLM tool call does not require parameters?**
A: The bug occurs when using the qwen3-32b model in streaming mode and calling a tool that does not require parameters. The model returns an empty string for the tool call's `arguments`, which the server cannot parse as JSON, resulting in a 400 status error from the OpenAI-compatible API.
**Q: What are the steps to reproduce the bug?**
A: To reproduce the bug, follow these steps:
1. Select the qwen3-32b model as the model to use.
2. Set `stream=True` to enable streaming mode.
3. Select a tool that does not require parameters.
**Q: What is the expected behavior?**
A: The expected behavior is that the tool should be called, and the results should be output.
**Q: What is the actual behavior?**
A: The tool itself executes, but the follow-up streaming request fails with `openai.BadRequestError: Error code: 400 - 'Expecting value: line 1 column 1 (char 0)'` because the replayed tool call carries an empty `arguments` string.
**Q: What are the possible solutions to the bug?**
A: One possible solution is to handle empty tool-call arguments in the `_format_messages` function of `agno.models.openai.chat.OpenAIChat`, for example by substituting `"{}"` when the arguments string is empty.
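As an illustration of that suggestion, here is a minimal sketch of the normalization such a fix could apply before messages are sent back to the server. The helper name and message shape are illustrative only, not agno's actual internals:

```python
def normalize_tool_call_arguments(messages: list[dict]) -> list[dict]:
    """Replace empty tool-call `arguments` with "{}" so they parse as JSON.

    Hypothetical helper: sketches the kind of normalization that could be
    applied in OpenAIChat._format_messages; not agno's real implementation.
    """
    for message in messages:
        for tool_call in message.get("tool_calls") or []:
            function = tool_call.get("function", {})
            if not function.get("arguments"):
                function["arguments"] = "{}"
    return messages


# Example: the assistant message from the failing run, with empty arguments.
messages = [{
    "role": "assistant",
    "content": "",
    "tool_calls": [{
        "id": "chatcmpl-tool-1077bda4db6941d1936130b9f9d57102",
        "type": "function",
        "function": {"name": "get_number", "arguments": ""},
    }],
}]
fixed = normalize_tool_call_arguments(messages)
print(fixed[0]["tool_calls"][0]["function"]["arguments"])  # {}
```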
**Q: What is the additional context for the bug?**
A: A similar problem was reported in pydantic-ai, where the conclusion was that the agent framework should handle this case.
**Q: What are the environment details for the bug?**
A: The environment details are:
* OS: Windows 11
* Agno Version: v1.4.5
* External Dependency Versions: openai-v1.78.0
* Additional Environment Details: Python 3.12.8
**Q: What are the screenshots or logs for the bug?**
A: The full debug log and traceback are included in the Actual Behavior section above.
**Q: What are the possible workarounds for the bug?**
A: One possible workaround is to use the deepseek-chat API instead of the qwen3-32b model.
**Q: What are the next steps to resolve the bug?**
A: Reproduce the issue, confirm where the empty `arguments` string is forwarded unchanged, and patch the message-formatting code to emit valid JSON.
**Q: What are the benefits of resolving the bug?**
A: Resolving the bug makes streaming tool calls work reliably for models that emit empty arguments, improving the stability of the agent framework.
**Q: What are the potential consequences of not resolving the bug?**
A: Streaming runs that invoke parameterless tools will continue to fail with 400 errors on affected models.