Overview
OpenRouterLLMService provides access to OpenRouter’s language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management, giving you access to multiple model providers through a single API.
OpenRouter LLM API Reference
Pipecat’s API methods for OpenRouter integration
Example Implementation
Complete example with function calling
OpenRouter Documentation
Official OpenRouter API documentation and features
OpenRouter Platform
Access multiple model providers and manage API keys
Installation
To use OpenRouter services, install the required dependencies:

Prerequisites
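The install command did not survive extraction; assuming Pipecat's usual extras convention (OpenRouter rides on the OpenAI-compatible client, so the openai extra supplies the dependencies), the command is typically:

```shell
# Extra name assumed from Pipecat's convention: OpenRouter uses the
# OpenAI-compatible client, so the openai extra covers it.
pip install "pipecat-ai[openai]"
```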
OpenRouter Account Setup
Before using OpenRouter LLM services, you need:

- OpenRouter Account: Sign up at OpenRouter
- API Key: Generate an API key from your account dashboard
- Model Selection: Choose from hundreds of available models from different providers
- Credits: Add credits to your account for model usage
Required Environment Variables
OPENROUTER_API_KEY: Your OpenRouter API key for authentication
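For example, the key can be exported in your shell before launching the bot (placeholder value shown — substitute the key from your OpenRouter dashboard):

```shell
# Placeholder value; real OpenRouter keys come from your account dashboard.
export OPENROUTER_API_KEY="sk-or-..."
```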
Configuration
api_key
OpenRouter API key for authentication. If not provided, the client will attempt to read it from environment variables.

model
Model identifier to use, in the provider/model format.

base_url
Base URL for the OpenRouter API endpoint (https://openrouter.ai/api/v1).
InputParams
This service uses the same input parameters as OpenAILLMService. See OpenAI LLM for details.
Usage
Basic Setup
With a Different Provider Model
Notes
- OpenRouter model identifiers use the provider/model format (e.g., openai/gpt-4o, anthropic/claude-sonnet-4-20250514, google/gemini-pro).
- When using Gemini models through OpenRouter, the service automatically handles the constraint that only one system message is allowed by converting additional system messages to user messages.
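The Gemini handling described above amounts to keeping the first system message and re-tagging later ones as user messages. A hypothetical pure-Python sketch of that rule (an illustration, not Pipecat's actual implementation):

```python
def coalesce_system_messages(messages: list[dict]) -> list[dict]:
    """Keep the first system message; convert any later system messages
    to user messages, mirroring the one-system-message constraint
    Gemini imposes when used through OpenRouter."""
    seen_system = False
    out = []
    for msg in messages:
        if msg["role"] == "system":
            if seen_system:
                # Downgrade extra system messages to user messages.
                out.append({**msg, "role": "user"})
                continue
            seen_system = True
        out.append(msg)
    return out
```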