Uno LLM Gateway is a high-performance, open-source gateway written in Golang that provides a unified interface to interact with large language models from various providers. It sits between your application and LLM providers, offering access control via virtual keys, centralized configuration, and deep observability through OpenTelemetry traces.

Key Features

  • Unified API: Interact with different LLM providers using a single, consistent request/response format
  • Virtual Keys: Protect your provider API keys by using Uno-generated virtual keys in your client applications
  • Centralized Configuration: Manage provider settings through a user-friendly UI without changing code
  • Deep Observability: Automatically capture traces, token usage, and latency metrics for every request using ClickHouse and OpenTelemetry
  • No-Code Agent Builder: Build and deploy AI agents with tools, conversation history, and structured outputs—all from the dashboard

Capabilities

Uno Gateway provides two primary capabilities, an LLM gateway and a no-code agent builder:

LLM Gateway

Point any OpenAI, Anthropic, or Gemini SDK at the gateway and unlock virtual keys, rate limiting, and request logging.

Virtual Keys

Protect your provider API keys. Generate virtual keys with usage limits and revoke them instantly when needed.

Drop-in Replacement

Point your existing OpenAI, Anthropic, or Google GenAI client to the gateway and get access to all gateway features.

Deep Observability

Track every request with OpenTelemetry. Monitor token usage, latency, and costs with ClickHouse-powered analytics.

Request Logging

Every LLM call is logged with full request/response details for debugging and compliance.

Drop-in Replacement

Point any existing SDK to the gateway; no code changes are required. Simply update the base URL and use your virtual key.
Go:
// Uses the official OpenAI Go SDK: github.com/openai/openai-go and its option package.
client := openai.NewClient(
    option.WithBaseURL("http://localhost:6060/api/gateway/openai"),
    option.WithAPIKey("your-virtual-key"),
)
Python:
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:6060/api/gateway/openai",
    api_key="your-virtual-key",
)
TypeScript:
import OpenAI from "openai";

const client = new OpenAI({
    baseURL: "http://localhost:6060/api/gateway/openai",
    apiKey: "your-virtual-key",
});
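
Once a client points at the gateway, requests are made exactly as they would be against the provider directly. A minimal sketch using the Python client configured above (the model name is an assumption; substitute any model enabled for your provider configuration):
Python:
# Standard OpenAI SDK call; the gateway proxies it to the provider and logs the request.
completion = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name; use one enabled for your provider key
    messages=[{"role": "user", "content": "Hello through the gateway"}],
)
print(completion.choices[0].message.content)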

Key Management

Manage provider API keys and virtual keys with rate limits, all from a centralized dashboard.
Uno Gateway Dashboard

Gateway Logs

View all LLM requests with full details—prompts, responses, token usage, and latency metrics.
LLM Gateway Logs

No-Code Agent Builder

Beyond proxying LLM calls, Uno Gateway includes a complete no-code agent builder. Define model parameters, system prompts, MCP tool servers, and output schemas—all from the dashboard, without writing code.
No-Code Agent Builder

Conversational UI

Interact with your agents through a built-in chat interface. Test prompts, debug tool calls, and iterate on your agents in real-time without leaving the dashboard.
Conversational UI for Agents

Agent Traces

Every agent invocation is captured with detailed traces. Debug complex multi-step workflows, monitor performance, and understand exactly how your agents process requests.
Agent Traces and Observability

Supported Providers

Uno Gateway currently supports OpenAI, Anthropic, and Gemini. Support for additional providers is on the roadmap.
Support is tracked per provider across text, image, tool calls, and reasoning:

  • OpenAI
  • Anthropic
  • Gemini
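
Anthropic and Gemini SDKs connect the same way as the OpenAI examples above. A minimal sketch with the Anthropic Python SDK (the /api/gateway/anthropic path is an assumption mirroring the OpenAI route; confirm the exact prefix for your deployment):
Python:
import anthropic

# Assumed route: the OpenAI examples use /api/gateway/openai, so an analogous
# Anthropic path is shown here; check your gateway configuration for the real one.
client = anthropic.Anthropic(
    base_url="http://localhost:6060/api/gateway/anthropic",
    api_key="your-virtual-key",
)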

Next Steps