
Commit 7cefb15

Feat: Enable authentication to Google via Vertex AI and Application Default Credentials
1 parent fb9b86d commit 7cefb15

File tree: .env.example, README.md, pyproject.toml, server.py

4 files changed, +45 -9 lines changed


.env.example

Lines changed: 8 additions & 2 deletions
@@ -20,8 +20,14 @@ OPENAI_BASE_URL="https://api.openai.com/v1"
 
 # Example Google mapping:
 # PREFERRED_PROVIDER="google"
-# BIG_MODEL="gemini-2.5-pro-preview-03-25"
-# SMALL_MODEL="gemini-2.0-flash"
+# BIG_MODEL="gemini-2.5-pro"
+# SMALL_MODEL="gemini-2.5-flash"
+
+# Example Google with vertex AI auth via ADC:
+# PREFERRED_PROVIDER="google"
+# USE_VERTEX_AUTH=true
+# BIG_MODEL="gemini-2.5-pro"
+# SMALL_MODEL="gemini-2.5-flash"
 
 # Example "just an Anthropic proxy" mode:
 # PREFERRED_PROVIDER="anthropic"
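For the ADC route shown above, credentials typically come from `gcloud auth application-default login` or from a service-account key referenced by `GOOGLE_APPLICATION_CREDENTIALS`. A minimal sketch (not part of this commit; it only assumes the `google-auth` dependency added in pyproject.toml) to confirm ADC resolves before enabling `USE_VERTEX_AUTH`:

```python
# Sanity-check Application Default Credentials before setting USE_VERTEX_AUTH=true.
# Assumes ADC has already been configured, e.g. via
# `gcloud auth application-default login`.
import google.auth

credentials, detected_project = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
print(f"ADC resolved; default project: {detected_project}")
# If this differs from the project you want to bill, set VERTEX_PROJECT explicitly.
```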

README.md

Lines changed: 19 additions & 4 deletions
@@ -13,6 +13,7 @@ A proxy server that lets you use Anthropic clients with Gemini, OpenAI, or Anthr
 
 - OpenAI API key 🔑
 - Google AI Studio (Gemini) API key (if using Google provider) 🔑
+- Google Cloud Project with Vertex AI API enabled (if using Application Default Credentials for Gemini) ☁️
 - [uv](https://github.com/astral-sh/uv) installed.
 
 ### Setup 🛠️
@@ -40,7 +41,10 @@ A proxy server that lets you use Anthropic clients with Gemini, OpenAI, or Anthr
 
 * `ANTHROPIC_API_KEY`: (Optional) Needed only if proxying *to* Anthropic models.
 * `OPENAI_API_KEY`: Your OpenAI API key (Required if using the default OpenAI preference or as fallback).
-* `GEMINI_API_KEY`: Your Google AI Studio (Gemini) API key (Required if PREFERRED_PROVIDER=google).
+* `GEMINI_API_KEY`: Your Google AI Studio (Gemini) API key (Required if `PREFERRED_PROVIDER=google` and `USE_VERTEX_AUTH` is not enabled).
+* `USE_VERTEX_AUTH` (Optional): Set to `true` to authenticate to Vertex AI via Application Default Credentials (ADC) instead of a static API key. Note: when `USE_VERTEX_AUTH=true`, you must configure `VERTEX_PROJECT` and `VERTEX_LOCATION`.
+* `VERTEX_PROJECT` (Optional): Your Google Cloud Project ID (Required if `PREFERRED_PROVIDER=google` and `USE_VERTEX_AUTH=true`).
+* `VERTEX_LOCATION` (Optional): The Google Cloud region for Vertex AI (e.g., `us-central1`) (Required if `PREFERRED_PROVIDER=google` and `USE_VERTEX_AUTH=true`).
 * `PREFERRED_PROVIDER` (Optional): Set to `openai` (default), `google`, or `anthropic`. This determines the primary backend for mapping `haiku`/`sonnet`.
 * `BIG_MODEL` (Optional): The model to map `sonnet` requests to. Defaults to `gpt-4.1` (if `PREFERRED_PROVIDER=openai`) or `gemini-2.5-pro-preview-03-25`. Ignored when `PREFERRED_PROVIDER=anthropic`.
 * `SMALL_MODEL` (Optional): The model to map `haiku` requests to. Defaults to `gpt-4.1-mini` (if `PREFERRED_PROVIDER=openai`) or `gemini-2.0-flash`. Ignored when `PREFERRED_PROVIDER=anthropic`.
@@ -151,13 +155,24 @@ GEMINI_API_KEY="your-google-key" # Needed if PREFERRED_PROVIDER=google
 # SMALL_MODEL="gpt-4.1-mini" # Optional, it's the default
 ```
 
-**Example 2: Prefer Google**
+**Example 2a: Prefer Google (using GEMINI_API_KEY)**
 ```dotenv
 GEMINI_API_KEY="your-google-key"
 OPENAI_API_KEY="your-openai-key" # Needed for fallback
 PREFERRED_PROVIDER="google"
-# BIG_MODEL="gemini-2.5-pro-preview-03-25" # Optional, it's the default for Google pref
-# SMALL_MODEL="gemini-2.0-flash" # Optional, it's the default for Google pref
+# BIG_MODEL="gemini-2.5-pro" # Optional, it's the default for Google pref
+# SMALL_MODEL="gemini-2.5-flash" # Optional, it's the default for Google pref
+```
+
+**Example 2b: Prefer Google (using Vertex AI with Application Default Credentials)**
+```dotenv
+OPENAI_API_KEY="your-openai-key" # Needed for fallback
+PREFERRED_PROVIDER="google"
+VERTEX_PROJECT="your-gcp-project-id"
+VERTEX_LOCATION="us-central1"
+USE_VERTEX_AUTH=true
+# BIG_MODEL="gemini-2.5-pro" # Optional, it's the default for Google pref
+# SMALL_MODEL="gemini-2.5-flash" # Optional, it's the default for Google pref
 ```
 
 **Example 3: Use Direct Anthropic ("Just an Anthropic Proxy" Mode)**
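The requirements above reduce to a simple rule: with `PREFERRED_PROVIDER=google`, either supply `GEMINI_API_KEY`, or set `USE_VERTEX_AUTH=true` together with `VERTEX_PROJECT` and `VERTEX_LOCATION`. A hypothetical startup check expressing that rule (the proxy does not ship this helper; it is shown only to make the matrix concrete):

```python
# Hypothetical validation of the Google-provider settings documented above.
import os

def check_google_settings() -> None:
    if os.environ.get("PREFERRED_PROVIDER", "openai") != "google":
        return
    use_vertex = os.environ.get("USE_VERTEX_AUTH", "False").lower() == "true"
    if use_vertex:
        # ADC path: project and region are required, no static key is needed.
        missing = [v for v in ("VERTEX_PROJECT", "VERTEX_LOCATION") if not os.environ.get(v)]
        if missing:
            raise RuntimeError(f"USE_VERTEX_AUTH=true also requires: {', '.join(missing)}")
    elif not os.environ.get("GEMINI_API_KEY"):
        # API-key path: a Google AI Studio key is required.
        raise RuntimeError("PREFERRED_PROVIDER=google requires GEMINI_API_KEY unless USE_VERTEX_AUTH=true")
```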

pyproject.toml

Lines changed: 3 additions & 1 deletion
@@ -9,7 +9,9 @@ dependencies = [
     "uvicorn>=0.34.0",
     "httpx>=0.25.0",
     "pydantic>=2.0.0",
-    "litellm>=1.40.14",
+    "litellm>=1.77.7",
     "python-dotenv>=1.0.0",
+    "google-auth>=2.41.1",
+    "google-cloud-aiplatform>=1.120.0",
 ]

server.py

Lines changed: 15 additions & 2 deletions
@@ -82,6 +82,13 @@ def format(self, record):
 OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY")
 GEMINI_API_KEY = os.environ.get("GEMINI_API_KEY")
 
+# Get Vertex AI project and location from environment (if set)
+VERTEX_PROJECT = os.environ.get("VERTEX_PROJECT", "unset")
+VERTEX_LOCATION = os.environ.get("VERTEX_LOCATION", "unset")
+
+# Option to use Application Default Credentials (ADC) for Vertex AI instead of a Gemini API key
+USE_VERTEX_AUTH = os.environ.get("USE_VERTEX_AUTH", "False").lower() == "true"
+
 # Get OpenAI base URL from environment (if set)
 OPENAI_BASE_URL = os.environ.get("OPENAI_BASE_URL")
 
@@ -1125,8 +1132,14 @@ async def create_message(
             else:
                 logger.debug(f"Using OpenAI API key for model: {request.model}")
         elif request.model.startswith("gemini/"):
-            litellm_request["api_key"] = GEMINI_API_KEY
-            logger.debug(f"Using Gemini API key for model: {request.model}")
+            if USE_VERTEX_AUTH:
+                litellm_request["vertex_project"] = VERTEX_PROJECT
+                litellm_request["vertex_location"] = VERTEX_LOCATION
+                litellm_request["custom_llm_provider"] = "vertex_ai"
+                logger.debug(f"Using Gemini ADC with project={VERTEX_PROJECT}, location={VERTEX_LOCATION} and model: {request.model}")
+            else:
+                litellm_request["api_key"] = GEMINI_API_KEY
+                logger.debug(f"Using Gemini API key for model: {request.model}")
         else:
             litellm_request["api_key"] = ANTHROPIC_API_KEY
             logger.debug(f"Using Anthropic API key for model: {request.model}")
