🚀 Your Free Personal AI Gateway

Aggregate multiple AI service providers, overcome rate limits, and maximize free quotas

Apex AI Proxy Illustration

Why Choose Apex AI Proxy?

🆓

Completely Free

Runs entirely on Cloudflare Workers' free plan, so there is no cost involved

🔄

Load Balancing

Intelligently distributes requests across multiple providers to overcome rate limits

💰

Maximize Free Quotas

Take advantage of free tiers from different AI providers, saving costs

🔑

Multiple API Keys

Register multiple keys for the same service provider to further increase limits

🤖

OpenAI Client Compatible

Works with any library that speaks OpenAI's API format, no code changes needed

🌐

Multi-Provider Support

Aggregate Azure, DeepSeek, Aliyun, and more behind one unified API
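At its core, the load balancing described above comes down to rotating requests over a pool of providers and keys. A minimal round-robin sketch of that idea (the key names are illustrative, and this is not the proxy's actual code):

```python
from itertools import cycle

# Hypothetical key pool; the real proxy reads its pools from wrangler-config.js.
keys = ["azure-key-1", "deepseek-key-1", "deepseek-key-2"]
rotation = cycle(keys)

def next_key():
    """Return the next API key in round-robin order."""
    return next(rotation)

# Four requests wrap around the three-key pool.
picks = [next_key() for _ in range(4)]
print(picks)  # ['azure-key-1', 'deepseek-key-1', 'deepseek-key-2', 'azure-key-1']
```

Because each upstream key has its own rate limit, spreading requests this way multiplies the effective limit by the number of keys in the pool.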

How It Works

1

Configure Providers

Configure your AI service providers and API keys in wrangler-config.js

2

Deploy to Cloudflare

Deploy your proxy to Cloudflare Workers with a simple command

3

Use Unified API

Send requests through the OpenAI-compatible API; the proxy automatically routes them to available providers

4

Break Limits

Enjoy higher request limits and lower costs while maintaining API consistency
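The provider setup from step 1 lives in wrangler-config.js. As a rough sketch of the shape such a config might take (field names here are illustrative; consult the repository's example config for the real schema):

```javascript
// Illustrative shape only -- see the repository's example config for the real schema.
module.exports = {
  providers: {
    deepseek: {
      base_url: 'https://api.deepseek.com/v1',
      api_keys: ['sk-key-1', 'sk-key-2'], // multiple keys raise the effective rate limit
    },
    azure: {
      base_url: 'https://your-resource.openai.azure.com',
      api_keys: ['azure-key-1'],
    },
  },
  models: {
    // Requests for "DeepSeek-R1" are spread across every provider listed here.
    'DeepSeek-R1': ['deepseek', 'azure'],
  },
};
```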

Apex AI Proxy Architecture

Get Started in 60 Seconds

1. Clone Repository

git clone https://github.com/loadchange/apex-ai-proxy.git
cd apex-ai-proxy

2. Install Dependencies

pnpm install

3. Configure Providers

Configure your providers and API keys in wrangler-config.js, or use our visual configuration tool

4. Deploy to Cloudflare Workers

pnpm run deploy

Usage Examples

Python

# Works with ANY OpenAI client!
from openai import OpenAI

client = OpenAI(
    base_url="https://your-proxy.workers.dev/v1",
    api_key="your-configured-api-key"
)

# Use any model you've configured in your proxy
response = client.chat.completions.create(
    model="DeepSeek-R1",  # This will be routed to one of your configured providers
    messages=[{"role": "user", "content": "Why is this proxy awesome?"}]
)

print(response.choices[0].message.content)

Node.js

// Using the OpenAI Node.js client
import OpenAI from 'openai';

const openai = new OpenAI({
  baseURL: 'https://your-proxy.workers.dev/v1',
  apiKey: 'your-configured-api-key',
});

async function main() {
  const completion = await openai.chat.completions.create({
    model: 'DeepSeek-R1',  // This will be routed to one of your configured providers
    messages: [{ role: 'user', content: 'Why is this proxy awesome?' }],
  });

  console.log(completion.choices[0].message.content);
}

main();

cURL

curl https://your-proxy.workers.dev/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-configured-api-key" \
  -d '{
    "model": "DeepSeek-R1",
    "messages": [{"role": "user", "content": "Why is this proxy awesome?"}]
  }'

Visual Configuration Tool

Easily create your wrangler-config.js file with our visual tool


Like This Project?

If Apex AI Proxy has helped you, please consider giving our GitHub repository a Star ⭐
It's crucial for our project's growth and continuous improvement!

Star Us ⭐

Help more people discover this project

Support open source development

Get notified of project updates