The following content was translated with translation tools, so there may be some inaccuracies.
Problem
When I use qwen3-32b as the model and call a tool that takes no parameters in streaming mode, the model always returns the arguments as None. pydantic-ai then treats the returned ToolCallPart as incomplete, keeps it as a ToolCallPartDelta, and as a result never calls the tool.
Three conditions together trigger this scenario:
- Using the stream method of ModelRequestNode inside the iter method.
- The tool being called takes no parameters.
- The model itself never returns non-None arguments for such a call (see the sketch after this list).
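The following is a minimal sketch of what I assume the difference in the raw streamed arguments looks like (hypothetical values, not captured from either server), which would explain why pydantic-ai never gets a complete ToolCallPart:

import json

# Assumed raw `function.arguments` values streamed for the no-parameter tool call:
qwen_args = None      # qwen3-32b: arguments stay null / never arrive
deepseek_args = "{}"  # deepseek-chat: an explicit empty JSON object

for label, raw in [("qwen3-32b", qwen_args), ("deepseek-chat", deepseek_args)]:
    parsed = json.loads(raw) if raw is not None else None
    print(f"{label}: raw={raw!r} -> parsed={parsed!r}")
# With parsed arguments of None there is nothing to promote the part to a complete
# ToolCallPart, so it stays a ToolCallPartDelta and the tool is never invoked.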
Code
import asyncio

from pydantic_ai import Agent
from pydantic_ai.messages import (
    PartDeltaEvent,
    TextPartDelta,
    ToolCallPartDelta,
)
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider

model = OpenAIModel(
    "qwen3-32b",
    provider=OpenAIProvider(
        api_key="sk-xxxxx",
        base_url="http://127.0.0.1:8080",
    ),
)

weather_agent = Agent(
    model,
    # "deepseek:deepseek-chat",
    system_prompt="You are a weather agent. Please use get_weather() to get the weather.",
)


@weather_agent.tool_plain
def get_weather() -> str:
    return "24°C and sunny."


output_messages: list[str] = []


async def main():
    user_prompt = "What will the weather be like?"

    # Begin a node-by-node, streaming iteration
    async with weather_agent.iter(user_prompt) as run:
        async for node in run:
            if Agent.is_model_request_node(node):
                async with node.stream(run.ctx) as request_stream:
                    async for event in request_stream:
                        if isinstance(event, PartDeltaEvent):
                            if isinstance(event.delta, TextPartDelta):
                                print(event.delta.content_delta, end="", flush=True)
                            elif isinstance(event.delta, ToolCallPartDelta):
                                print(
                                    f"[Request] Part {event.index} args_delta={event.delta.args_delta}"
                                )


if __name__ == "__main__":
    asyncio.run(main())
Output
# use qwen3-32b
PS D:\Code\python\agent-demo> & D:/Code/python/agent-demo/.venv/Scripts/python.exe d:/Code/python/agent-demo/test.py
<think>
Okay, the user is asking about the weather. I need to use the get_weather function. Wait, the function parameters are empty. Hmm, maybe the function doesn't require any arguments. But how does it know where to get the weather for? Oh, maybe it's designed to use the user's location automatically. I should check the function description, but it's empty here. Well, the instructions say to use get_weather, so I'll call it without any parameters. Let's see, the tool call should be a JSON object with the name and arguments. Since there are no parameters, arguments will be an empty object. Alright, that should work.
</think>
# use deepseek-chat
PS D:\Code\python\agent-demo> & D:/Code/python/agent-demo/.venv/Scripts/python.exe d:/Code/python/agent-demo/test.py
[Request] Part 1 args_delta={}
The weather will be 24°C and sunny. Enjoy the pleasant day!
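To check whether the None arguments really come from the model rather than from pydantic-ai's parsing, the raw stream can be inspected with the openai client directly. A minimal sketch, assuming the server behind base_url speaks the OpenAI chat completions API (the tool schema below is hypothetical):

from openai import OpenAI

client = OpenAI(api_key="sk-xxxxx", base_url="http://127.0.0.1:8080")

stream = client.chat.completions.create(
    model="qwen3-32b",
    messages=[{"role": "user", "content": "What will the weather be like?"}],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather.",
                "parameters": {"type": "object", "properties": {}},
            },
        }
    ],
    stream=True,
)

for chunk in stream:
    if not chunk.choices:
        continue
    delta = chunk.choices[0].delta
    if delta.tool_calls:
        for tc in delta.tool_calls:
            # Print the raw name/arguments fragments as they arrive.
            print(f"name={tc.function.name!r} arguments={tc.function.arguments!r}")

If the arguments fragments printed here are already None or empty for qwen3-32b, the problem is on the model/server side rather than in pydantic-ai.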
Question
My question is: should this be considered a bug in pydantic-ai's streaming mode, or a limitation of the model itself?
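As a possible workaround in the meantime (a sketch only; I have not confirmed it changes qwen3-32b's behavior), one could give the tool an optional parameter so the model has a concrete field to emit in its arguments:

@weather_agent.tool_plain
def get_weather(city: str = "local") -> str:
    """Return the weather, optionally for a specific city."""
    # The `city` parameter exists only to make the tool schema non-empty.
    return "24°C and sunny."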
Additional Context
pydantic-ai: 0.1.3
python: 3.12.8