LiteLLM calls not working because of improper imports #1402

@shubhamgupta-dat

Description

Describe the bug

The Function class cannot be imported from the openai-python package, because it is no longer present in the module it is imported from:

from openai.types.chat.chat_completion_message_tool_call import Function

Here is the current source of the module from which the Function class is being imported:

# File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.

from typing import Union
from typing_extensions import Annotated, TypeAlias

from ..._utils import PropertyInfo
from .chat_completion_message_custom_tool_call import ChatCompletionMessageCustomToolCall
from .chat_completion_message_function_tool_call import ChatCompletionMessageFunctionToolCall

__all__ = ["ChatCompletionMessageToolCall"]

ChatCompletionMessageToolCall: TypeAlias = Annotated[
    Union[ChatCompletionMessageFunctionToolCall, ChatCompletionMessageCustomToolCall],
    PropertyInfo(discriminator="type"),
]
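The module above no longer defines a concrete class: ChatCompletionMessageToolCall is now a discriminated union over the function and custom tool-call variants, so attributes such as Function moved into the per-variant modules. A minimal, self-contained sketch of that pattern (class and field names here are illustrative, not openai-python's actual definitions):

```python
from dataclasses import dataclass
from typing import Literal, Union

@dataclass
class FunctionToolCall:
    # Discriminator field: identifies this variant of the union
    type: Literal["function"]
    name: str

@dataclass
class CustomToolCall:
    type: Literal["custom"]
    payload: str

# The old single class became a union alias; anything that lived
# alongside the old class must now be imported from a variant module.
ToolCall = Union[FunctionToolCall, CustomToolCall]
```

This is why the import path changed rather than the class simply being renamed.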

Debug information

  • Agents SDK version: v0.2.4
  • OpenAI-Python version: v1.99.1
  • Python version (e.g. Python 3.12)

Repro steps

Create a venv with the versions listed above and use a LiteLLM Router; the import fails at module load time.

Expected behavior

The import should not fail.

Possible fix

The import can be fixed by pointing at the module that now defines Function:

from openai.types.chat.chat_completion_message_function_tool_call import Function
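For code that has to run against both older and newer openai-python releases, a version-tolerant lookup is one option. A minimal sketch, assuming only the two import paths discussed above (the helper name load_first is hypothetical, not part of either library):

```python
import importlib

def load_first(attr, module_names):
    """Return the named attribute from the first module that provides it."""
    for name in module_names:
        try:
            return getattr(importlib.import_module(name), attr)
        except (ImportError, AttributeError):
            continue
    raise ImportError(f"{attr} not found in any of: {', '.join(module_names)}")

# Usage (assumes openai is installed):
# Function = load_first("Function", [
#     "openai.types.chat.chat_completion_message_function_tool_call",  # >= 1.99
#     "openai.types.chat.chat_completion_message_tool_call",           # older
# ])
```

Alternatively, the SDK could simply switch to the new path and pin openai>=1.99 as a minimum dependency.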
