Help: can't run roulette_wheel with DeepSeek model #654

Open
ME-Msc opened this issue Jan 10, 2025 · 3 comments


ME-Msc commented Jan 10, 2025

I am trying to run the roulette_wheel example with the DeepSeek model.

I followed the docs.

The only code I added is:

import os
import logfire
from dotenv import load_dotenv
from pydantic_ai.models.openai import OpenAIModel

load_dotenv()
logfire.configure()
deepseek_model = OpenAIModel(
    model_name="deepseek-chat",
    base_url="https://api.deepseek.com",
    api_key=os.getenv("OPENAI_API_KEY"),
)

And I changed the model of roulette_agent:

roulette_agent = Agent(
    model=deepseek_model,
    deps_type=Deps,
    retries=3,
    result_type=bool,
    system_prompt=(
        "Use the `roulette_wheel` function to determine if the "
        "customer has won based on the number they bet on."
    ),
)

I hit an error at the async with roulette_agent.run_stream block in main(); a rough sketch of that call, and the error it produces, are below.
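For reference, this is roughly what my main() looks like (a sketch adapted from the doc example; the Deps construction is abbreviated here):

import asyncio

async def main():
    deps = Deps(...)  # my Deps instance, built as in the doc example
    async with roulette_agent.run_stream(
        "Put my money on square eighteen", deps=deps
    ) as result:
        # the exception is raised while iterating the streamed response
        async for message in result.stream():
            print(message)

asyncio.run(main())

The error: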

Expected delta with content, invalid chunk: ChatCompletionChunk(id='f97676fd-771c-44f7-9b2d-e528689fdbc9', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, refusal=None, role=None, tool_calls=[ChoiceDeltaToolCall(index=0, id='call_0_a7ed393b-31cf-4857-bf10-2199d63240ff', function=ChoiceDeltaToolCallFunction(arguments='', name='roulette_wheel'), type='function')]), finish_reason=None, index=0, logprobs=None)], created=1736505898, model='deepseek-chat', object='chat.completion.chunk', service_tier=None, system_fingerprint='fp_3a5770e1b4', usage=None)

Could anyone tell me why this happens? Did I call the model in the wrong way?
I think @imfing may know; could you help me with a complete example of roulette_wheel?


ME-Msc commented Jan 10, 2025

Logfire shows the following:

[screenshots of the Logfire trace]


samihamine commented Jan 10, 2025

Hi @ME-Msc,

I believe the error you encountered is related to the issue described in #149. It seems to arise when the model's response includes a function call rather than (or alongside) plain text content. Specifically, in your case, the ChatCompletionChunk contains a tool_call (indicating a function call to roulette_wheel) but lacks accompanying content or a well-structured text response.

At this stage, it appears that the OpenAIStreamTextResponse class does not fully handle this particular case, where a chunk carries a function_call instead of (or alongside) plain text content. This has been identified as a limitation of the current implementation.

From what I understand, there is an ongoing refactor of this class's behavior (see PR #468) that could address this issue and should resolve similar problems where mixed outputs are expected in a single chunk.
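In the meantime, one possible workaround (an untested sketch on my side, reusing the roulette_agent and deps from your snippet) is to avoid the streaming path entirely, since the failure happens while decoding streamed chunks; a plain synchronous run does not go through OpenAIStreamTextResponse:

# Untested sketch: use run_sync instead of run_stream to sidestep the streaming code path.
# Assumes roulette_agent and a Deps instance named deps are defined as in your snippet.
result = roulette_agent.run_sync("Put my money on square eighteen", deps=deps)
print(result.data)  # bool, since result_type=bool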


ME-Msc commented Jan 11, 2025

@samihamine Thanks for your help! I believe many people have encountered this error, and I will continue to monitor this issue.

Additionally, I would like to suggest raising this error more explicitly and clearly, so that newcomers like me can quickly identify the problem.
