
Conversation


@uesleilima uesleilima commented May 21, 2025

PR: Major Refactor – Native Async APIs, Dependency Upgrades, and Adapter Modernization

Overview

Implementation of Issue #45

The main driver for this change is to support native async/await throughout the API, greatly increasing flexibility, performance, and alignment with the latest patterns in the Langchain and FastAPI ecosystems.


Key Changes

1. Full Async Support for Agent Lifecycle and APIs

  • Introduced async-first APIs for agent lifecycle (acreate_agent, ainvoke, etc.), FastAPI routers, and key integration points.
  • Every FastAPI endpoint & router now natively supports async execution.
  • Agent factories, adapters, and model bridges leverage async execution, making it easier to work with streaming, real-time LLMs, and scalable deployments.
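The async-first lifecycle above can be sketched as follows. The `acreate_agent`/`ainvoke` names come from this PR; the `Agent` class body and model name here are hypothetical stand-ins, not the project's real implementation:

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Agent:
    """Hypothetical stand-in for the refactored agent object."""
    model: str

    async def ainvoke(self, prompt: str) -> str:
        # A real agent would await an LLM call here.
        await asyncio.sleep(0)
        return f"[{self.model}] echo: {prompt}"

async def acreate_agent(model: str) -> Agent:
    # Async factory: a real implementation may open async clients here.
    await asyncio.sleep(0)
    return Agent(model=model)

async def main() -> str:
    agent = await acreate_agent("example-model")
    return await agent.ainvoke("hello")

print(asyncio.run(main()))
```

The same coroutines can be awaited directly inside an `async def` FastAPI endpoint, which is what makes the endpoints natively async.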

2. Dependency and Model Adapter Modernization

  • Upgraded dependencies to the latest major and minor versions for strong compatibility with modern LLM/client packages.
    • New minimums: langchain, openai, langchain-openai/anthropic, fastapi, etc.
  • Refactored model adapters (LlamaCpp and others) to use the new import paths and proper async support.

3. Consistent Prompt Handling and Factory APIs

  • Updated ReAct agent construction to use the renamed parameter (messages_modifier → prompt)
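A minimal sketch of the keyword rename. The stub below only mirrors the signature change described above; the real `create_react_agent` lives in `langgraph.prebuilt` and does far more:

```python
# Hypothetical stub mirroring the renamed keyword; not the real implementation.
def create_react_agent(model, tools, *, prompt=None):
    return {"model": model, "tools": tools, "prompt": prompt}

# Before this PR (no longer accepted):
#   create_react_agent(model, tools, messages_modifier=system_message)
# After this PR:
agent = create_react_agent("example-model", [], prompt="You are a helpful assistant.")
print(agent["prompt"])
```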

4. Streaming and Real-Time Features

  • Improved streaming support, including async iterators and SSE/streaming endpoint compatibility, which is essential for modern multi-turn chat and LLM applications.
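The async-iterator-to-SSE pattern can be sketched as below. The token source is a hypothetical stand-in for a model stream; the SSE framing is what a FastAPI `StreamingResponse` generator would typically emit:

```python
import asyncio
from typing import AsyncIterator

async def astream_tokens(prompt: str) -> AsyncIterator[str]:
    # Stand-in for a model's token stream; a real adapter yields LLM output.
    for token in ["Hello", ", ", "world"]:
        await asyncio.sleep(0)
        yield token

async def sse_events(prompt: str) -> AsyncIterator[str]:
    # Wrap each token in Server-Sent Events framing ("data: ...\n\n").
    async for token in astream_tokens(prompt):
        yield f"data: {token}\n\n"

async def collect(prompt: str) -> list[str]:
    return [event async for event in sse_events(prompt)]

print(asyncio.run(collect("hi")))
```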

5. Dependency Pinning & Poetry Lock Refresh

  • All dependencies in pyproject.toml and poetry.lock were reviewed and updated to ensure maximum compatibility with async frameworks and modern LLM model packages.
  • Superfluous or outdated development packages have been cleaned up, contributing to lighter and more reliable CI/CD and deployment.

6. Example Factories, Test Suites, and Documentation Updates

  • All examples and test agent factories have been updated for async signatures and new model/adapters.
  • In-code and README docs now detail how to leverage async and streaming patterns in downstream/codegen integrations.

Motivation

  • Adopt async-first patterns: Allows this backend and its users to work efficiently with concurrent, streaming, or long-running LLM tasks—a must for modern API usage and real-time AI applications.
  • Modernize for the latest Langchain/OpenAI/Anthropic/LLM adapters: Ensures out-of-the-box compatibility with the fastest-moving open-source LLM landscape.
  • Improve maintainability: Clearer separation of concerns, more explicit factory/adapter/subclass patterns, and tighter test/CI ensure future-proofing and easier onboarding for new contributors.

Migration/Breaking Changes

  • All agent and API lifecycle hooks are now async. Downstream consumers must use await and async FastAPI endpoints when customizing.
  • Prompt & agent factory signatures are now consistently async; see new examples for idiomatic usage.
  • Sync/legacy interfaces are not supported in this version. Migration is required for downstream consumers.
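For downstream consumers that cannot migrate a sync call site immediately, `asyncio.run` is the simplest bridge to the new async-only APIs. The `acreate_agent`/`ainvoke` names come from this PR; the bodies are hypothetical stand-ins:

```python
import asyncio

class Agent:
    """Hypothetical stand-in for the refactored async agent."""
    async def ainvoke(self, prompt: str) -> str:
        await asyncio.sleep(0)
        return f"echo: {prompt}"

async def acreate_agent(model: str) -> Agent:
    return Agent()

def run_agent_sync(model: str, prompt: str) -> str:
    # Legacy sync call sites can drive the async lifecycle in a fresh
    # event loop until they are fully migrated to native async/await.
    async def _run() -> str:
        agent = await acreate_agent(model)
        return await agent.ainvoke(prompt)
    return asyncio.run(_run())

print(run_agent_sync("example-model", "hello"))
```

Note that `asyncio.run` cannot be called from inside an already-running event loop, so this bridge is only for purely synchronous callers.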

Compatibility

  • This PR is a major, breaking upgrade for consumers of the former synchronous-only codebase.
  • Strongly recommended for new projects, or those planning to run with modern Langchain/FastAPI/LLM stacks and requiring scalable, async/streaming support.

Thank you for considering this major upgrade—this PR sets up the project and its users for scalable, concurrent, and forward-looking LLM API deployments!

@benjaminvdb

Great job! It would be wonderful to see this merged ❤️

@caffeinism
Contributor

This is a much needed feature for me.

@samuelint
Owner

@uesleilima please fix lint issues before I merge this feature.

@samuelint
Owner

This PR is a duplicate of #52, which includes a few additional changes, so I’ll be merging that one instead.
@uesleilima — I believe @caffeinism built upon your work from this PR, so your contributions are definitely not lost 😊 Thanks a lot for your work!

@samuelint samuelint closed this Jul 5, 2025
@samuelint samuelint mentioned this pull request Jul 5, 2025