Local proxy that exposes GitHub Copilot as an OpenAI-compatible endpoint, enabling any OpenAI API client to use GitHub Copilot with secure OAuth authentication and full streaming support.


Copilot Proxy

⚠️ EDUCATIONAL PURPOSE ONLY: This project is for educational and learning purposes only. It demonstrates API proxy patterns, authentication flows, and performance optimization techniques. Not intended for production use.

A command-line tool that exposes GitHub Copilot as an OpenAI-compatible API endpoint for educational exploration of API integration patterns.

✨ Features

  • OpenAI-Compatible API - Educational example of API compatibility layers
  • GitHub OAuth Flow - Learn device flow authentication patterns
  • Client Compatibility - Role normalization for Cline, Continue.dev, and other AI clients
  • Performance Optimizations - Connection pooling, caching, compression
  • CLI Tool - Easy installation and usage via npm
  • Cross-Platform - Works on Windows, macOS, and Linux
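The role normalization mentioned above can be sketched as follows. This is a minimal illustration of the idea, not the project's actual implementation; the role set and fallback behavior are assumptions:

```python
# Hypothetical sketch: coerce non-standard message roles to ones the
# upstream API accepts. Some clients (e.g. Cline, Continue.dev) emit
# roles outside the OpenAI set, which a proxy can normalize in-flight.

VALID_ROLES = {"system", "user", "assistant", "tool"}

def normalize_roles(messages: list[dict]) -> list[dict]:
    """Map unknown roles to 'user'; leave valid roles untouched."""
    normalized = []
    for msg in messages:
        role = msg.get("role", "user")
        normalized.append({**msg, "role": role if role in VALID_ROLES else "user"})
    return normalized
```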

🚀 Quick Start

Prerequisites

  • Node.js 20+
  • Active GitHub Copilot subscription
  • Basic understanding of APIs and authentication

Installation

# Install globally via npm
npm install -g @hazeruno/copilot-proxy

# Or install locally
npm install @hazeruno/copilot-proxy

Usage

# Start server (automatically authenticates if needed)
copilot-proxy

# Custom port/host
copilot-proxy --port=3000 --host=localhost

# Manual authentication only (optional)
copilot-proxy --auth

Server runs on http://127.0.0.1:8069 by default and automatically handles authentication on startup.

📚 Learning Objectives

This project demonstrates:

  • API Proxy Patterns - How to create compatibility layers between different APIs
  • OAuth Device Flow - Modern authentication for CLI/desktop applications
  • Performance Optimization - Connection pooling, response compression, and intelligent caching
  • Fault Tolerance - Circuit breaker patterns, retries, and graceful error handling
  • TypeScript Best Practices - Clean architecture, type safety, and structured logging
  • HTTP Optimization - Keep-alive connections, response caching, and compression
  • Memory Management - Efficient token caching and connection pooling
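As a rough illustration of the circuit-breaker idea listed above (a sketch with assumed thresholds and timings, not code from this project):

```python
import time

class CircuitBreaker:
    """Minimal circuit breaker: open after N consecutive failures,
    then allow a trial request once a cooldown period has elapsed."""

    def __init__(self, failure_threshold: int = 5, reset_timeout: float = 30.0):
        self.failure_threshold = failure_threshold
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None  # monotonic timestamp when the breaker opened

    def allow_request(self) -> bool:
        if self.opened_at is None:
            return True  # closed: requests flow normally
        # Half-open: permit one trial request after the cooldown.
        return time.monotonic() - self.opened_at >= self.reset_timeout

    def record_success(self) -> None:
        self.failures = 0
        self.opened_at = None

    def record_failure(self) -> None:
        self.failures += 1
        if self.failures >= self.failure_threshold:
            self.opened_at = time.monotonic()
```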

🔧 Usage Examples

Basic API Call

curl -X POST http://127.0.0.1:8069/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

With Python

import openai

client = openai.OpenAI(
    api_key="dummy-key",  # any value works; the proxy handles Copilot auth
    base_url="http://127.0.0.1:8069/v1"
)
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello from Python!"}]
)
print(response.choices[0].message.content)

With JavaScript/Node.js

import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: 'dummy-key',  // any value works; the proxy handles Copilot auth
  baseURL: 'http://127.0.0.1:8069/v1'
});

const response = await client.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Hello from Node.js!' }]
});

console.log(response.choices[0].message.content);

With Anthropic Claude API Format

curl -X POST http://127.0.0.1:8069/v1/messages \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "Hello, Claude!"}
    ]
  }'

With Droid CLI

The /v1/messages endpoint is compatible with tools that use Anthropic's Claude API format:

# Configure your droid CLI to use the proxy
export ANTHROPIC_BASE_URL="http://127.0.0.1:8069/v1"

# Now use droid normally - it will route through the proxy to GitHub Copilot
droid "your prompt here"

📚 Programmatic Usage

You can also use copilot-proxy as a library in your Node.js applications:

import { CopilotAPIServer, GitHubCopilotAuth } from '@hazeruno/copilot-proxy';

// Check authentication status
const isAuthenticated = await GitHubCopilotAuth.isAuthenticated();

// Start server programmatically
const server = new CopilotAPIServer(8069, '127.0.0.1');
await server.start();

⚙️ Configuration

Configure via environment variables or command line arguments:

Command Line Arguments

# Default: binds to localhost only (127.0.0.1)
copilot-proxy

# Override port
copilot-proxy --port=8080

# Allow network access (bind to all interfaces)
copilot-proxy --host=0.0.0.0

# Combine options
copilot-proxy --port=8080 --host=0.0.0.0

Note: The server always defaults to 127.0.0.1 (localhost only) for security. To allow access from other machines on your network, explicitly use --host=0.0.0.0.

Environment Variables

PORT=8069                          # Server port
LOG_LEVEL=info                     # debug, info, warn, error
ENABLE_COMPRESSION=true            # Response compression (recommended)
CACHE_HEADERS=true                 # Client-side caching (recommended)
ENABLE_CONNECTION_POOLING=true     # HTTP connection pooling (recommended)

Help

copilot-proxy --help        # Show all available options

🔍 Key API Endpoints

  • POST /v1/chat/completions - Main chat endpoint (OpenAI-compatible)
  • POST /v1/messages - Anthropic Claude-compatible endpoint
  • GET /v1/models - List available models
  • GET /auth/status - Check authentication status
  • GET / - Health check and server info
  • GET /metrics - Performance metrics and monitoring data
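The read-only endpoints above can be exercised from Python. A sketch using only the standard library (assumes a server already running on the default port, and that /v1/models returns an OpenAI-style {"data": [...]} list):

```python
import json
import urllib.request

BASE = "http://127.0.0.1:8069"

def model_ids(models_response: dict) -> list:
    """Extract model IDs from an OpenAI-style /v1/models response."""
    return [m["id"] for m in models_response.get("data", [])]

def get_json(url: str) -> dict:
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(model_ids(get_json(f"{BASE}/v1/models")))  # available models
    print(get_json(f"{BASE}/auth/status"))           # authentication status
```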

🛠️ Development

If you want to contribute or modify the code:

# Clone the repository
git clone https://github.com/huynguyen03dev/copilot-proxy.git
cd copilot-proxy

# Install dependencies
npm install

# Build the project
npm run build

# Run the server (automatically authenticates)
node dist/cli.js

# Type checking
npm run type-check

# Run tests
npm test

🔒 Security & Disclaimers

Educational Use Only:

  • This project demonstrates best-practice API server architecture
  • Learn connection pooling, caching, compression, and fault tolerance patterns
  • Not intended for production deployment or commercial use
  • Showcases TypeScript, performance optimization, and structured logging

Security Notes:

  • Tokens stored locally in system config directory (restricted permissions)
  • Server binds to 127.0.0.1 by default for security
  • Uses GitHub's internal API endpoints (subject to change)

Compliance:

  • Ensure compliance with GitHub's Terms of Service
  • Requires active GitHub Copilot subscription
  • Use responsibly and respect rate limits

🚨 Troubleshooting

Authentication Issues:

# Clear and re-authenticate
copilot-proxy --clear-auth
copilot-proxy --auth

Common Problems:

  • "Not authenticated" → Run copilot-proxy --auth
  • "Connection refused" → Check if server is running
  • "Token expired" → Server auto-refreshes, or re-authenticate
  • "Command not found" → Install globally with npm install -g @hazeruno/copilot-proxy

📄 License

MIT License - Educational use encouraged.


⚠️ Important: This project uses GitHub's internal Copilot API endpoints for educational purposes. These endpoints are not officially documented and may change. Always ensure compliance with GitHub's Terms of Service.
