A modern tech stack for building AI agents that combines the best of the Python and TypeScript ecosystems. This example demonstrates how to connect Pydantic AI backends with Vercel AI SDK frontends through streaming-compatible APIs.
The UI above shows a simple Math Agent example that currently has one tool (sum).
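In Pydantic AI terms, that agent boils down to something like the minimal sketch below. The model string and system prompt here are assumptions for illustration; see agent-backend/main.py for the actual configuration.

```python
from pydantic_ai import Agent

# Sketch of the Math Agent; the model string is an assumption --
# agent-backend/main.py holds the real configuration.
agent = Agent(
    "groq:qwen/qwen3-32b",
    system_prompt="You are a math assistant. Use the sum tool for addition.",
)

@agent.tool_plain
def sum(a: float, b: float) -> float:
    """Add two numbers and return the result."""
    return a + b
```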
Backend

- Pydantic AI - Type-safe AI agent framework with structured outputs
- FastAPI - Modern, fast web framework for building APIs
- uv - Blazing-fast Python package manager

Frontend

- React 19 - Modern UI library with hooks
- Vite - Lightning-fast build tool
- Vercel AI SDK - Streaming AI interactions
- Bun - Fast package manager and runtime
The Honest Truth
I love Pydantic AI, but it's only available in Python. I love the Vercel AI SDK, but it's only available in TypeScript. That forced me to bridge two worlds that don't naturally talk to each other.
What I Had to Hack Together
- Streaming compatibility between Pydantic AI and AI SDK
- Tool call handling across the language boundary
- Custom stream conversion to make everything work seamlessly
The Good Parts
- Both ecosystems have excellent hot reloading (FastAPI + Vite)
- uv and Bun are both fantastic package managers
The Trade-offs
- No shared type safety across the stack
- Having to maintain code in two languages
- Python and TypeScript don't natively communicate well
- More complexity than a single-language solution
Why This Repo Exists
I built this example to show how to make these two amazing but separate ecosystems work together effectively, despite their incompatibilities.
Install the required tools for the best developer experience:
```bash
# Install Bun - wicked fast package manager and runtime
curl -fsSL https://bun.sh/install | bash

# Install uv - Python package management as fast as Bun is for TypeScript
curl -LsSf https://astral.sh/uv/install.sh | sh
```

Backend Setup

```bash
# Navigate to the backend directory
cd agent-backend

# Install dependencies with uv
uv sync

# Start the development server with hot reloading
fastapi dev main.py
```

The backend will be available at http://localhost:8000.
Frontend Setup

```bash
# Navigate to the frontend directory
cd agent-frontend

# Install dependencies with Bun
bun install

# Start the development server
bun run dev
```

The frontend will be available at http://localhost:5173.
```
├── agent-backend/       # Python FastAPI backend
│   ├── main.py          # FastAPI app with Pydantic AI agent
│   ├── pyproject.toml   # Python dependencies (uv)
│   └── uv.lock          # Lockfile for reproducible builds
├── agent-frontend/      # React TypeScript frontend
│   ├── src/
│   │   ├── App.tsx      # Main app component
│   │   ├── Chat.tsx     # Chat interface with Vercel AI SDK
│   │   └── ...
│   ├── package.json     # Node.js dependencies (Bun)
│   └── bun.lock         # Lockfile for reproducible builds
└── README.md            # This file
```
The backend converts Pydantic AI streams to Vercel AI SDK's Data Stream Protocol format, including tool calls and streaming text.
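Conceptually, the conversion looks something like the sketch below: each chunk from the Pydantic AI stream is wrapped in the protocol's `TYPE:JSON` line framing. This is a simplified illustration that only emits text parts; the `/api/chat` route, the `ChatRequest` model, and the reuse of the `agent` from the earlier sketch are assumptions, not the exact code in main.py.

```python
import json

from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    prompt: str  # simplified; the real route handles full message history

@app.post("/api/chat")  # hypothetical route name
async def chat(request: ChatRequest):
    async def to_data_stream():
        # Wrap each text delta in the Data Stream Protocol framing:
        # a type prefix, a JSON payload, and a newline, e.g. 0:"Hi"\n.
        async with agent.run_stream(request.prompt) as result:
            async for delta in result.stream_text(delta=True):
                yield f"0:{json.dumps(delta)}\n"
        # Tool calls (9:...) and tool results (a:...) use the same
        # framing; a finish part tells the client the run is done.
        yield 'd:{"finishReason":"stop"}\n'

    return StreamingResponse(
        to_data_stream(),
        media_type="text/plain",
        # useChat looks for this header to detect a data stream.
        headers={"x-vercel-ai-data-stream": "v1"},
    )
```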
Backend commands:

- Development server: `fastapi dev main.py`
- Production server: `fastapi run main.py`
- Install dependencies: `uv sync`

Frontend commands:

- Development server: `bun run dev`
- Build: `bun run build`
- Lint: `bun run lint`
- Preview build: `bun run preview`
Create a `.env` file in the `agent-backend` directory:

```bash
# agent-backend/.env
GROQ_API_KEY=your_groq_key_here
```

This example uses Groq Cloud with the Qwen3 32B model, but you can use any provider that Pydantic AI supports. Check out the full list of supported providers at https://ai.pydantic.dev/api/providers/.
Why Groq + Qwen3 32B? I chose this combination because it's cheap and decent at native tool calling.
Using a Different Provider? If you want to use a different provider:
- Update your environment variables to match the provider's API key name
- Modify the pydantic-settings configuration in the backend code
- Update the model name in your agent configuration
For example, to use OpenAI instead:
```bash
# agent-backend/.env
OPENAI_API_KEY=your_openai_key_here
```
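Steps 2 and 3 might then look like the following sketch, assuming a pydantic-settings class that loads the key from `.env` and hands it to the agent via the environment. The class name, field, and model string are illustrative, not the backend's actual code.

```python
import os

from pydantic_ai import Agent
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    # Matched case-insensitively against OPENAI_API_KEY in .env.
    openai_api_key: str

    model_config = SettingsConfigDict(env_file=".env")

settings = Settings()

# Pydantic AI's OpenAI provider picks the key up from the environment.
os.environ.setdefault("OPENAI_API_KEY", settings.openai_api_key)

# Step 3: point the agent at the new provider/model.
agent = Agent("openai:gpt-4o", system_prompt="You are a math assistant.")
```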
Resources

- Pydantic AI Documentation
- FastAPI Documentation
- Vercel AI SDK Documentation
- uv Documentation
- Bun Documentation
This example demonstrates the essential integration between Pydantic AI and Vercel AI SDK. Feel free to clone this repo and use it as a starting point for your own projects.
This project is licensed under the MIT License - see the LICENSE.md file for details.
