Overview

SambaNovaLLMService provides access to SambaNova’s language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management with SambaNova’s high-performance inference platform.

Installation

To use SambaNova services, install the required dependencies:
pip install "pipecat-ai[sambanova]"

Prerequisites

SambaNova Account Setup

Before using SambaNova LLM services, you need:
  1. SambaNova Account: Sign up at SambaNova Cloud
  2. API Key: Generate an API key from your account dashboard
  3. Model Selection: Choose from available high-performance models

Required Environment Variables

  • SAMBANOVA_API_KEY: Your SambaNova API key for authentication
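For example, you can set it in your shell before starting your application (replace the placeholder with your actual key):

```shell
# Set the SambaNova API key for the current shell session.
export SAMBANOVA_API_KEY="your-api-key"
```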

Configuration

  • api_key (str, required): SambaNova API key for authentication.
  • model (str, default: "Llama-4-Maverick-17B-128E-Instruct"): Model identifier to use.
  • base_url (str, default: "https://api.sambanova.ai/v1"): Base URL for the SambaNova API endpoint.

InputParams

This service uses the same input parameters as OpenAILLMService. See OpenAI LLM for details.

Usage

Basic Setup

import os
from pipecat.services.sambanova import SambaNovaLLMService

llm = SambaNovaLLMService(
    api_key=os.getenv("SAMBANOVA_API_KEY"),
    model="Llama-4-Maverick-17B-128E-Instruct",
)

With Custom Parameters

import os

from pipecat.services.sambanova import SambaNovaLLMService

llm = SambaNovaLLMService(
    api_key=os.getenv("SAMBANOVA_API_KEY"),
    model="Llama-4-Maverick-17B-128E-Instruct",
    params=SambaNovaLLMService.InputParams(
        temperature=0.7,
        top_p=0.9,
        max_tokens=1024,
    ),
)

Notes

  • SambaNova does not support frequency_penalty, presence_penalty, or seed parameters.
  • SambaNova has custom handling for tool call indexing. The service includes compatibility logic for processing function calls from the SambaNova API.
  • SambaNova is known for high-throughput inference on large language models.
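If you build request payloads by hand against the OpenAI-compatible endpoint (outside of this service), a small filter like the following keeps the unsupported parameters out of the request. This is a sketch; `sanitize_payload` is a hypothetical helper for illustration, not part of Pipecat:

```python
# Parameters that SambaNova's OpenAI-compatible API rejects.
UNSUPPORTED_PARAMS = {"frequency_penalty", "presence_penalty", "seed"}


def sanitize_payload(payload: dict) -> dict:
    """Return a copy of an OpenAI-style request payload without unsupported keys."""
    return {k: v for k, v in payload.items() if k not in UNSUPPORTED_PARAMS}


request = {
    "model": "Llama-4-Maverick-17B-128E-Instruct",
    "temperature": 0.7,
    "frequency_penalty": 0.5,  # would be rejected by SambaNova
    "seed": 42,                # would be rejected by SambaNova
}
print(sanitize_payload(request))
# Only "model" and "temperature" remain in the sanitized payload.
```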