
Overview

TogetherLLMService provides access to Together AI's hosted open-source language models, including Meta's Llama 3.1 and 3.2 families, through an OpenAI-compatible interface. Because it inherits from OpenAILLMService, it supports streaming responses, function calling, and context management.

Installation

To use Together AI services, install the required dependencies:
pip install "pipecat-ai[together]"

Prerequisites

Together AI Account Setup

Before using Together AI LLM services, you need:
  1. Together AI Account: Sign up at Together AI
  2. API Key: Generate an API key from your account dashboard
  3. Model Selection: Choose from available open-source models (Llama, Mistral, etc.)

Required Environment Variables

  • TOGETHER_API_KEY: Your Together AI API key for authentication
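For local development, the key is typically exported in the shell before starting your application (shown here as an example; a `.env` file with a loader such as python-dotenv works equally well):

```shell
export TOGETHER_API_KEY="your-together-api-key"
```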

Configuration

  • api_key (str, required): Together AI API key for authentication.
  • base_url (str, default: "https://api.together.xyz/v1"): Base URL for the Together AI API endpoint.
  • model (str, default: "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo"): Model identifier to use.
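Because the endpoint is OpenAI-compatible, the base_url and api_key map directly onto a standard chat-completions request. The sketch below builds (but does not send) such a request with only the standard library; build_chat_request is a hypothetical illustration, not part of Pipecat or the Together SDK:

```python
import json
import urllib.request

BASE_URL = "https://api.together.xyz/v1"  # default base_url from the table above

def build_chat_request(api_key: str, model: str, messages: list) -> urllib.request.Request:
    """Build an OpenAI-style chat completions request for the Together endpoint."""
    payload = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url=f"{BASE_URL}/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "dummy-key",
    "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    [{"role": "user", "content": "Hello"}],
)
print(req.full_url)  # https://api.together.xyz/v1/chat/completions
```

Pipecat handles this wiring internally; the sketch is only meant to show why swapping base_url is sufficient to target Together AI with OpenAI-style clients.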

InputParams

This service uses the same input parameters as OpenAILLMService. See OpenAI LLM for details.

Usage

Basic Setup

import os
from pipecat.services.together import TogetherLLMService

llm = TogetherLLMService(
    api_key=os.getenv("TOGETHER_API_KEY"),
    model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
)

With Custom Parameters

import os
from pipecat.services.together import TogetherLLMService

llm = TogetherLLMService(
    api_key=os.getenv("TOGETHER_API_KEY"),
    model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    params=TogetherLLMService.InputParams(
        temperature=0.7,
        top_p=0.9,
        max_completion_tokens=1024,
    ),
)

Notes

  • Together AI hosts a wide variety of open-source models. Model identifiers use the organization/model-name format.
  • Together AI fully supports the OpenAI-compatible parameter set inherited from OpenAILLMService.
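The organization/model-name convention mentioned above can be illustrated with a small sketch (parse_model_id is a hypothetical helper, not part of the library):

```python
def parse_model_id(model_id: str) -> tuple[str, str]:
    """Split a Together AI model identifier into (organization, model name)."""
    org, _, name = model_id.partition("/")
    return org, name

org, name = parse_model_id("meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo")
print(org)   # meta-llama
print(name)  # Meta-Llama-3.1-8B-Instruct-Turbo
```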