
Conversation


@borisdev borisdev commented Jul 1, 2025

Summary

This PR adds Azure OpenAI as a provider integration to the proxy server. To start the server and run the Azure-only tests:

uv run uvicorn server:app --host 0.0.0.0 --port 8082 --reload
uv run python3 tests.py --azure-only

Key Features Added

  • Users who access OpenAI models through Azure can now use this library. The changes address the fact that an Azure OpenAI request needs more arguments than a plain OpenAI request: the resource endpoint, API version, and deployment name in addition to the API key (see the sketch below).
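
A rough sketch of what those extra arguments look like in a LiteLLM completion call. The environment variable names mirror the .env keys shown under Basic Azure Usage below; this is illustrative, not the exact code in server.py:

    import os
    import litellm

    # A plain OpenAI call only needs a model name and an API key.
    # An Azure OpenAI call also needs the resource endpoint, API version,
    # and deployment name, passed via LiteLLM's azure/ model prefix.
    response = litellm.completion(
        model="azure/" + os.environ["AZURE_DEPLOYMENT_NAME"],  # e.g. "azure/gpt-4"
        messages=[{"role": "user", "content": "Hello"}],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_base=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_version=os.environ["AZURE_API_VERSION"],
    )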

Basic Azure Usage

  1. Set Azure configuration in .env file:

    AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com"
    AZURE_OPENAI_API_KEY="your-azure-openai-api-key"
    AZURE_API_VERSION="2025-01-01-preview"
    AZURE_DEPLOYMENT_NAME="gpt-4"
  2. Use the model format azure/your-deployment-name in requests (an example follows below).
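
A hedged end-to-end example of step 2, assuming the proxy is running locally on port 8082 (the port used in the uvicorn command above) and exposes an Anthropic-style /v1/messages endpoint; the deployment name is a placeholder:

    import requests

    resp = requests.post(
        "http://localhost:8082/v1/messages",
        json={
            "model": "azure/your-deployment-name",  # azure/ prefix routes the request to Azure OpenAI
            "max_tokens": 256,
            "messages": [{"role": "user", "content": "Say hello"}],
        },
        timeout=60,
    )
    print(resp.json())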

Copilot AI review requested due to automatic review settings July 1, 2025 19:32

Copilot AI left a comment


Pull Request Overview

This PR integrates Azure OpenAI support into the existing proxy, allowing requests prefixed with azure/ to route through Azure deployments.

  • Introduces Azure environment variables and validation logic in server.py
  • Extends request handling to set Azure credentials and endpoint in LiteLLM calls
  • Adds two test scripts to verify direct and proxy-based Azure connectivity
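
One plausible shape of the environment-variable validation described in the first bullet above, using the variable names from the PR description; the actual checks in server.py may differ:

    import os

    AZURE_REQUIRED_VARS = [
        "AZURE_OPENAI_ENDPOINT",
        "AZURE_OPENAI_API_KEY",
        "AZURE_API_VERSION",
        "AZURE_DEPLOYMENT_NAME",
    ]

    def validate_azure_config() -> None:
        """Fail fast with a helpful message when an azure/ model is requested but config is incomplete."""
        missing = [name for name in AZURE_REQUIRED_VARS if not os.environ.get(name)]
        if missing:
            raise ValueError(
                "Azure OpenAI configuration is incomplete; set these variables in .env: "
                + ", ".join(missing)
            )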

Reviewed Changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated 3 comments.

File                  Description
server.py             Added Azure config, updated model routing, and extended OpenAI logic to support Azure models
test_azure_proxy.py   New script to verify proxy routing for azure/-prefixed models
test_azure_openai.py  New script for direct Azure OpenAI connectivity via LiteLLM
.env.example          Added commented Azure configuration section
Comments suppressed due to low confidence (1)

server.py:1695

  • Variable endpoint is undefined; the function signature uses path. Change endpoint to path or update the parameter name accordingly.
    log_line = f"{Colors.BOLD}{method} {endpoint}{Colors.RESET} {status_str}"
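
A minimal correction along the lines the comment suggests, assuming the function's parameter is indeed named path:

    log_line = f"{Colors.BOLD}{method} {path}{Colors.RESET} {status_str}"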

@borisdev borisdev changed the title from "Add comprehensive Azure OpenAI support" to "Azure OpenAI support" Jul 1, 2025
@borisdev
Author

@1rgs

@dsyme

dsyme commented Jul 17, 2025

@borisdev There are a lot of unrelated changes in server.py; it looks like you've applied formatting rules, etc.

Could you revert the unnecessary changes to reduce the merge conflicts with things like #39, please? I'd also like to read through your changes more carefully to see whether I've missed anything.

borisdev pushed a commit to borisdev/claude-code-proxy that referenced this pull request Aug 15, 2025
@borisdev borisdev force-pushed the main branch 11 times, most recently from 6926f7e to 6446168 on August 16, 2025 at 22:20
- Add Azure OpenAI provider support with full configuration
- Update model mapping to support azure/ prefix
- Add Azure configuration examples to .env.example
- Improve Azure test error messages and validation
- Add proper Azure error handling and helpful config guidance

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <[email protected]>
@borisdev
Author

> @borisdev There are a lot of unrelated changes in server.py; it looks like you've applied formatting rules, etc.
>
> Could you revert the unnecessary changes to reduce the merge conflicts with things like #39, please? I'd also like to read through your changes more carefully to see whether I've missed anything.

fixed
