
Conversation

@Aktsvigun

Dear team,

Thank you for an amazing package. This PR adds Nebius AI Studio to the list of available providers.

@Aktsvigun
Author

@markbackman @mattieruth Dear team, could you please review? This PR only adds new functionality (compatibility with the Nebius AI Studio provider) to enable broader use of the framework. Thank you!

Contributor

@markbackman left a comment

Looks good! Thanks for contributing. Just a few comments.

Also, a few other to-dos:

  • Update env.example with NEBIUS_API_KEY
  • Update the README with a link to the docs section
  • Add a docs entry following the other LLM examples. Docs repo: https://github.com/pipecat-ai/docs
  • Lint the code. Install the pre-commit hook with uv run pre-commit install from the root of the repo; you can run ./scripts/fix-ruff.sh to clean up.
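For the first item, env.example just gains a `NEBIUS_API_KEY=` line, and the service reads the key from the environment at startup. A minimal sketch of that lookup (`require_env` is a hypothetical helper for illustration, not part of pipecat):

```python
import os


def require_env(name: str, env=os.environ) -> str:
    """Return a required setting, failing loudly when it is missing or empty."""
    value = env.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; add it to your .env (see env.example)")
    return value
```

The explicit error beats a cryptic 401 from the API later; passing `env` makes the helper trivially testable.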

# SPDX-License-Identifier: BSD 2-Clause License
#

import sys
Contributor

Keep this file but remove all contents. The deprecation notice you see in others applies to older services only.

Contributor

Can you remove the contents of this file? This should be blank.

Contributor

This file should be blank.

@codecov

codecov bot commented Aug 19, 2025

Codecov Report

❌ Patch coverage is 0% with 12 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/pipecat/services/nebius/llm.py | 0.00% | 8 Missing ⚠️ |
| src/pipecat/services/nebius/__init__.py | 0.00% | 4 Missing ⚠️ |

| Files with missing lines | Coverage Δ |
|---|---|
| src/pipecat/services/nebius/__init__.py | 0.00% <0.00%> (ø) |
| src/pipecat/services/nebius/llm.py | 0.00% <0.00%> (ø) |

@Aktsvigun
Author

Mark, thank you for the extensive review! I'll fix everything once I'm back from vacation.

@Aktsvigun
Author

@markbackman Hey again! Could you please approve the PR?

@markbackman
Contributor

Sorry for the delay. There are still a few to-dos before this is ready.

@Aktsvigun
Author

@markbackman thank you for the comments; I've addressed your suggestions. Could you please review?

@Aktsvigun
Author

Hi @markbackman, kind ping here :)


[![PyPI](https://img.shields.io/pypi/v/pipecat-ai)](https://pypi.org/project/pipecat-ai) ![Tests](https://github.com/pipecat-ai/pipecat/actions/workflows/tests.yaml/badge.svg) [![codecov](https://codecov.io/gh/pipecat-ai/pipecat/graph/badge.svg?token=LNVUIVO4Y9)](https://codecov.io/gh/pipecat-ai/pipecat) [![Docs](https://img.shields.io/badge/Documentation-blue)](https://docs.pipecat.ai) [![Discord](https://img.shields.io/discord/1239284677165056021)](https://discord.gg/pipecat) [![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/pipecat-ai/pipecat)
Contributor

You probably need to rebase. The only change to the README should be to add Nebius to the LLM services list.

@@ -0,0 +1,51 @@
#
Contributor

Service implementation looks good!
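For readers following along: Nebius AI Studio exposes an OpenAI-compatible chat API, so the new service can be a thin wrapper over pipecat's OpenAI LLM service that only swaps the endpoint and default model. A self-contained stand-in sketch (the base class here is a stub, and the base URL and model id are illustrative assumptions, not the merged implementation):

```python
import os
from dataclasses import dataclass


@dataclass
class OpenAICompatibleLLMService:
    """Stub standing in for pipecat's OpenAILLMService (OpenAI-compatible chat API)."""

    api_key: str
    model: str
    base_url: str = "https://api.openai.com/v1"


@dataclass
class NebiusLLMService(OpenAICompatibleLLMService):
    """Sketch: only the endpoint default differs from the base service."""

    base_url: str = "https://api.studio.nebius.ai/v1"  # assumed Nebius endpoint


llm = NebiusLLMService(
    api_key=os.environ.get("NEBIUS_API_KEY", "test-key"),
    model="meta-llama/Meta-Llama-3.1-70B-Instruct",  # example model id
)
```

Because the API is OpenAI-compatible, the subclass needs almost no logic of its own, which is why the review above is this short.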

audio_in_enabled=True,
audio_out_enabled=True,
vad_analyzer=SileroVADAnalyzer(),
),
Contributor

transport_params should be:

transport_params = {
    "daily": lambda: DailyParams(
        audio_in_enabled=True,
        audio_out_enabled=True,
        vad_analyzer=SileroVADAnalyzer(params=VADParams(stop_secs=0.2)),
        turn_analyzer=LocalSmartTurnAnalyzerV3(params=SmartTurnParams()),
    ),
    "twilio": lambda: FastAPIWebsocketParams(
        audio_in_enabled=True,
        audio_out_enabled=True,
        vad_analyzer=SileroVADAnalyzer(params=VADParams(stop_secs=0.2)),
        turn_analyzer=LocalSmartTurnAnalyzerV3(params=SmartTurnParams()),
    ),
    "webrtc": lambda: TransportParams(
        audio_in_enabled=True,
        audio_out_enabled=True,
        vad_analyzer=SileroVADAnalyzer(params=VADParams(stop_secs=0.2)),
        turn_analyzer=LocalSmartTurnAnalyzerV3(params=SmartTurnParams()),
    ),
}

That will require new imports:

from pipecat.audio.turn.smart_turn.base_smart_turn import SmartTurnParams
from pipecat.audio.turn.smart_turn.local_smart_turn_v3 import LocalSmartTurnAnalyzerV3
from pipecat.audio.vad.vad_analyzer import VADParams

},
]

context = OpenAILLMContext(messages, tools)
Contributor

Update to match the new universal context pattern:

Suggested change:
- context = OpenAILLMContext(messages, tools)
+ context = LLMContext(messages, tools)

]

context = OpenAILLMContext(messages, tools)
context_aggregator = llm.create_context_aggregator(context)
Contributor

And:

Suggested change:
- context_aggregator = llm.create_context_aggregator(context)
+ context_aggregator = LLMContextAggregatorPair(context)

async def on_client_connected(transport, client):
    logger.info("Client connected")
    # Kick off the conversation.
    await task.queue_frames([context_aggregator.user().get_context_frame()])
Contributor

The new pattern for initiating the conversation is:

Suggested change:
- await task.queue_frames([context_aggregator.user().get_context_frame()])
+ await task.queue_frames([LLMRunFrame()])

@markbackman
Contributor

On track to be merged. Just fix up those last items and add a changelog. Then this should be good to go!

@markbackman
Contributor

Friendly ping to review these comments when you get a moment.
