Azure OpenAI support #33
Conversation
Pull Request Overview
This PR integrates Azure OpenAI support into the existing proxy, allowing requests prefixed with azure/ to route through Azure deployments.
- Introduces Azure environment variables and validation logic in `server.py`
- Extends request handling to set Azure credentials and endpoint in LiteLLM calls (see the sketch after this list)
- Adds two test scripts to verify direct and proxy-based Azure connectivity
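For context, here is a minimal sketch of what the LiteLLM call in the second bullet could look like. The environment variable names below follow LiteLLM's defaults and are assumptions, not necessarily the variables this PR introduces in `server.py`:

```python
import os
import litellm

def complete_via_azure(deployment, messages):
    """Sketch only: route a request to an Azure deployment through LiteLLM.

    The env var names follow LiteLLM's defaults and are assumptions,
    not necessarily the variables this PR adds to server.py.
    """
    return litellm.completion(
        model=f"azure/{deployment}",  # the "azure/" prefix selects LiteLLM's Azure provider
        messages=messages,
        api_key=os.environ["AZURE_API_KEY"],
        api_base=os.environ["AZURE_API_BASE"],  # e.g. https://<resource>.openai.azure.com
        api_version=os.environ.get("AZURE_API_VERSION", "2024-02-15-preview"),
    )
```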
Reviewed Changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| server.py | Added Azure config, updated model routing, and extended OpenAI logic to support Azure models |
| test_azure_proxy.py | New script to verify proxy routing for azure/ prefixed models |
| test_azure_openai.py | New script for direct Azure OpenAI connectivity via LiteLLM |
| .env.example | Added commented Azure configuration section |
Comments suppressed due to low confidence (1)
server.py:1695
- Variable `endpoint` is undefined; the function signature uses `path`. Change `endpoint` to `path` or update the parameter name accordingly.
log_line = f"{Colors.BOLD}{method} {endpoint}{Colors.RESET} {status_str}"
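For reference, a minimal sketch of the change the reviewer is suggesting, assuming the enclosing function's signature takes a `path` parameter (the full signature is not shown here):

```python
# Use the existing `path` parameter rather than the undefined `endpoint` name,
# or rename the parameter itself if `endpoint` is the intended term.
log_line = f"{Colors.BOLD}{method} {path}{Colors.RESET} {status_str}"
```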
@borisdev There are many unrelated changes in server.py - it looks like you've applied formatting rules etc. Could you revert the unnecessary changes to reduce merge conflicts with things like #39, please? I'd also like to read through your changes more carefully to make sure I haven't missed anything.
Force-pushed from 6926f7e to 6446168
- Add Azure OpenAI provider support with full configuration
- Update model mapping to support azure/ prefix
- Add Azure configuration examples to .env.example
- Improve Azure test error messages and validation
- Add proper Azure error handling and helpful config guidance

🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <[email protected]>
fixed
Summary
This PR adds Azure OpenAI provider integration to the proxy server.
Key Features Added
Basic Azure Usage
Set the Azure configuration in the `.env` file, then use the following model format in requests:

`azure/your-deployment-name`
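As a hedged illustration only (the env var names, proxy port, and endpoint path are assumptions, not taken from this PR), the basic flow might look like this:

```python
# Hypothetical .env contents (names assumed; see .env.example in this PR for the real ones):
#   AZURE_API_KEY=<your-azure-openai-key>
#   AZURE_API_BASE=https://<your-resource>.openai.azure.com
#   AZURE_API_VERSION=2024-02-15-preview

import requests

# Assumed proxy address and OpenAI-compatible endpoint; adjust to your actual setup.
resp = requests.post(
    "http://localhost:8082/v1/chat/completions",
    json={
        "model": "azure/your-deployment-name",  # "azure/" prefix routes via Azure OpenAI
        "messages": [{"role": "user", "content": "Hello from the Azure-backed proxy!"}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```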