LiteProxy Studio

A web-based management interface for LiteLLM Proxy that lets you create, manage, and run multiple LiteLLM proxy configurations through an intuitive UI.

Features

  • Config Management: Create, edit, and manage litellm proxy configuration files
  • YAML Validation: Real-time validation of configuration syntax
  • Multi-Instance Support: Run multiple proxy instances simultaneously on different ports
  • Process Management: Start, stop, and monitor proxy instances
  • Real-time Logs: View and stream logs from running proxies
  • Responsive UI: Modern, clean interface built with Next.js and Tailwind CSS

Tech Stack

Backend

  • FastAPI: High-performance Python web framework
  • SQLite: Lightweight database for storing configurations
  • SQLAlchemy: Async ORM for database operations
  • Pydantic: Data validation and settings management
  • psutil: Process and system monitoring

Frontend

  • Next.js 15: React framework with App Router
  • TypeScript: Type-safe development
  • Tailwind CSS: Utility-first CSS framework
  • shadcn/ui: Beautiful, accessible UI components
  • TanStack Query: Data fetching and caching
  • Monaco Editor: Code editor for YAML editing

Project Structure

lite-proxy-studio/
β”œβ”€β”€ backend/                 # FastAPI backend
β”‚   β”œβ”€β”€ api/                # API endpoints
β”‚   β”‚   β”œβ”€β”€ configs.py      # Config CRUD operations
β”‚   β”‚   β”œβ”€β”€ proxies.py      # Proxy management
β”‚   β”‚   └── logs.py         # Log retrieval
β”‚   β”œβ”€β”€ database/           # Database setup
β”‚   β”‚   β”œβ”€β”€ connection.py   # SQLAlchemy setup
β”‚   β”‚   └── models.py       # Database models
β”‚   β”œβ”€β”€ services/           # Business logic
β”‚   β”‚   β”œβ”€β”€ config_service.py
β”‚   β”‚   └── proxy_service.py
β”‚   β”œβ”€β”€ schemas/            # Pydantic schemas
β”‚   β”‚   β”œβ”€β”€ config.py
β”‚   β”‚   β”œβ”€β”€ proxy.py
β”‚   β”‚   └── log.py
β”‚   β”œβ”€β”€ utils/              # Utilities
β”‚   β”‚   β”œβ”€β”€ yaml_handler.py
β”‚   β”‚   └── process_manager.py
β”‚   β”œβ”€β”€ config.py           # App configuration
β”‚   └── main.py             # FastAPI app entry
β”œβ”€β”€ frontend/               # Next.js frontend
β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”œβ”€β”€ app/           # App routes
β”‚   β”‚   β”œβ”€β”€ components/    # React components
β”‚   β”‚   β”œβ”€β”€ lib/           # Utilities and API client
β”‚   β”‚   └── hooks/         # Custom React hooks
β”‚   └── components.json    # shadcn/ui config
β”œβ”€β”€ data/                  # Application data
β”‚   β”œβ”€β”€ configs/           # Stored YAML configs
β”‚   β”œβ”€β”€ logs/              # Proxy logs
β”‚   └── database.db        # SQLite database
└── README.md

Prerequisites

  • Python 3.11+
  • Node.js 18+
  • pnpm (for frontend package management)
  • uv (for Python package management)
  • litellm (install via pip: pip install litellm)

Installation

1. Clone the repository

git clone <repository-url>
cd lite-proxy-studio

2. Backend Setup

cd backend

# Install dependencies (uv will create a virtual environment automatically)
uv sync

# Or activate the virtual environment manually
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

3. Frontend Setup

cd frontend

# Install dependencies
pnpm install

4. Install LiteLLM

pip install litellm
# Or with uv
uv pip install litellm

Running the Application

Start Backend

cd backend

# Using uv
uv run python main.py

# Or with activated virtualenv
python main.py

# Or with uvicorn directly
uvicorn main:app --host 0.0.0.0 --port 8080 --reload

The backend API will be available at: http://localhost:8080 (API routes under /api)

Start Frontend

cd frontend

# Development mode
pnpm dev

# Or
pnpm run dev

The frontend will be available at: http://localhost:3000

Environment Variables

Backend

Create a .env file in the backend directory (optional):

# Database
DATABASE_URL=sqlite+aiosqlite:///./data/database.db

# CORS
CORS_ORIGINS=http://localhost:3000,http://localhost:3001
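
These variables are presumably read by backend/config.py; a minimal sketch of how they might be loaded with pydantic-settings (attribute names mirror the variables above and are assumptions, not necessarily the app's actual settings class):

# Sketch of settings loading; the real backend/config.py may be structured differently.
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")

    database_url: str = "sqlite+aiosqlite:///./data/database.db"
    cors_origins: str = "http://localhost:3000"

settings = Settings()
allowed_origins = [origin.strip() for origin in settings.cors_origins.split(",")]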

Frontend

Create a .env.local file in the frontend directory:

NEXT_PUBLIC_API_URL=http://localhost:8080/api

API Endpoints

Configs

  • GET /api/configs - List all configurations
  • POST /api/configs - Create a new configuration
  • GET /api/configs/{id} - Get a specific configuration
  • PUT /api/configs/{id} - Update a configuration
  • DELETE /api/configs/{id} - Delete a configuration
  • POST /api/configs/validate - Validate YAML configuration
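
The request and response schemas for these endpoints aren't documented here, so the payload field names below (name, description, content) are assumptions for illustration. A minimal sketch of validating and creating a config from Python:

# Illustrative only: payload field names are assumed, not taken from API docs.
import requests

API = "http://localhost:8080/api"

config_yaml = """
model_list:
  - model_name: gpt-4
    litellm_params:
      model: gpt-4
      api_key: your-openai-key
"""

# Validate the YAML first (POST /api/configs/validate)
print(requests.post(f"{API}/configs/validate", json={"content": config_yaml}).json())

# Then create the configuration (POST /api/configs)
resp = requests.post(f"{API}/configs", json={
    "name": "openai-gpt4",            # assumed field
    "description": "Example config",  # assumed field
    "content": config_yaml,           # assumed field
})
print(resp.json())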

Proxies

  • GET /api/proxies - List all proxy instances
  • POST /api/proxies/start - Start a new proxy instance
  • POST /api/proxies/{id}/stop - Stop a proxy instance
  • GET /api/proxies/{id} - Get proxy instance details
  • GET /api/proxies/{id}/status - Get detailed proxy status

Logs

  • GET /api/logs/{instance_id} - Get logs for a proxy instance
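
As with the config endpoints, the start payload isn't specified above, so config_id and port below are assumed field names. A sketch of starting a proxy, checking its status, and reading its logs:

# Illustrative only: "config_id", "port", and the "id" response field are assumptions.
import requests

API = "http://localhost:8080/api"

# Start a proxy instance from an existing config on port 8000
instance = requests.post(f"{API}/proxies/start", json={"config_id": 1, "port": 8000}).json()
print(instance)

instance_id = instance.get("id")  # assumed response field

# Check detailed status and fetch its logs
print(requests.get(f"{API}/proxies/{instance_id}/status").json())
print(requests.get(f"{API}/logs/{instance_id}").text)

# Stop it when finished
requests.post(f"{API}/proxies/{instance_id}/stop")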

Usage

1. Create a Configuration

  1. Navigate to the Configs page
  2. Click "New Config"
  3. Enter a name and description
  4. Write or paste your litellm YAML configuration
  5. The editor will validate syntax in real-time
  6. Click "Save" to create the configuration

Example minimal config:

model_list:
  - model_name: gpt-4
    litellm_params:
      model: gpt-4
      api_key: your-openai-key

litellm_settings:
  drop_params: true

2. Start a Proxy

  1. Select a configuration from the list
  2. Choose an available port (8000-9999)
  3. Click "Start Proxy"
  4. Monitor the status and logs in real-time
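
Once the proxy shows as running, you can point any OpenAI-compatible client at it. A quick smoke test (assuming you chose port 8000 and your config defines a gpt-4 model; add an Authorization: Bearer header if you set a master key in general_settings):

# Smoke test against the LiteLLM proxy started above (port 8000 assumed)
import requests

resp = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "gpt-4",  # must match a model_name from the selected config
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
print(resp.json())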

3. Manage Running Proxies

  • View all running proxies on the Proxies page
  • Check status, uptime, and resource usage
  • View real-time logs
  • Stop proxies when no longer needed

Development

Backend Development

cd backend

# Run with auto-reload
uvicorn main:app --reload --host 0.0.0.0 --port 8080

# Run tests (if added)
pytest

Frontend Development

cd frontend

# Development server with hot reload
pnpm dev

# Type checking
pnpm run type-check

# Linting
pnpm run lint

# Build for production
pnpm run build

Configuration Validation

The application validates LiteLLM configurations against these common top-level sections:

  • model_list: Array of model configurations
  • litellm_settings: LiteLLM-specific settings
  • general_settings: General proxy settings
  • router_settings: Routing and load balancing
  • environment_variables: Environment configuration

Unknown sections will generate warnings but won't prevent saving.
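
As an illustration of what this section-level check implies (a sketch only, not the actual backend/utils/yaml_handler.py), the top-level keys of the parsed YAML can be compared against the known sections:

# Sketch of section-level validation; the real yaml_handler.py may differ.
import yaml

KNOWN_SECTIONS = {
    "model_list", "litellm_settings", "general_settings",
    "router_settings", "environment_variables",
}

def validate_config(text: str) -> tuple[list[str], list[str]]:
    """Return (errors, warnings) for a LiteLLM config given as YAML text."""
    try:
        data = yaml.safe_load(text)
    except yaml.YAMLError as exc:
        return [f"Invalid YAML: {exc}"], []
    if not isinstance(data, dict):
        return ["Config must be a YAML mapping of sections"], []
    warnings = [f"Unknown section: {key}" for key in data if key not in KNOWN_SECTIONS]
    return [], warnings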

Troubleshooting

Port Already in Use

If you get a "port already in use" error:

  1. Check running proxies in the Proxies page
  2. Stop any conflicting proxy instances
  3. Or choose a different port
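
Because the backend already depends on psutil, you can also check from Python which process is holding a port (a standalone debugging helper, not part of the app; listing connections may require elevated permissions on some systems):

# Find the process listening on a given port using psutil.
import psutil

def who_owns_port(port: int) -> str | None:
    for conn in psutil.net_connections(kind="inet"):
        if conn.laddr and conn.laddr.port == port and conn.status == psutil.CONN_LISTEN:
            return psutil.Process(conn.pid).name() if conn.pid else "unknown"
    return None

print(who_owns_port(8000))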

LiteLLM Not Found

Make sure litellm is installed and available in your PATH:

which litellm
# or
litellm --version

Database Errors

The SQLite database is created automatically in data/database.db. If you encounter issues:

rm data/database.db
# Restart the backend to recreate the database

Future Enhancements

  • WebSocket support for real-time log streaming
  • Config import/export functionality
  • Proxy health monitoring and alerts
  • Rate limiting and quota management
  • User authentication and authorization
  • Docker containerization
  • Proxy usage analytics and metrics
  • Config templates and examples
  • Batch operations on multiple proxies

License

MIT

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Support

For issues and questions, please open an issue on GitHub.
