
Overview

GoogleVertexLLMService provides access to Google’s language models through Vertex AI. It extends GoogleLLMService and uses the Google AI Python SDK with Vertex AI authentication, connecting to Google’s enterprise AI platform with its security and compliance controls.

Installation

To use Google Vertex AI services, install the required dependencies:
pip install "pipecat-ai[google]"

Prerequisites

Google Cloud Setup

Before using Google Vertex AI LLM services, you need:
  1. Google Cloud Account: Sign up at Google Cloud Console
  2. Project Setup: Create a project and enable the Vertex AI API
  3. Service Account: Create a service account with Vertex AI permissions
  4. Authentication: Set up credentials via service account key or Application Default Credentials

Required Environment Variables

  • GOOGLE_APPLICATION_CREDENTIALS: Path to your service account key file (recommended)
  • Or use Application Default Credentials for cloud deployments
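Before handing a credentials JSON string to the service, you can sanity-check that it contains the fields a service account key always carries. The helper below is a hypothetical convenience, not part of pipecat:

```python
import json

# Hypothetical helper (not part of pipecat): verify that a service-account
# JSON string looks usable before passing it as `credentials`.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_service_account_json(raw: str) -> str:
    """Return the key's project_id, or raise if required fields are missing."""
    info = json.loads(raw)
    missing = REQUIRED_FIELDS - info.keys()
    if missing:
        raise ValueError(f"credentials JSON missing fields: {sorted(missing)}")
    return info["project_id"]
```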

Configuration

  • credentials (str, default: None): JSON string of Google service account credentials for authentication.
  • credentials_path (str, default: None): Path to the service account JSON key file; an alternative to passing credentials as a string.
  • project_id (str, required): Google Cloud project ID.
  • location (str, default: "us-east4"): GCP region for the Vertex AI endpoint (e.g., "us-east4", "us-central1").
  • model (str, default: "gemini-2.5-flash"): Model identifier to use.
  • params (InputParams, default: None): Runtime-configurable model settings. See Google Gemini for InputParams details.
  • system_instruction (str, default: None): System instruction/prompt for the model.
  • tools (list, default: None): List of available tools/functions for the model.
  • tool_config (dict, default: None): Configuration for tool usage behavior.
  • http_options (HttpOptions, default: None): HTTP options for the Google AI client.
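As an illustration, a tools list can take Gemini's function-declaration shape. This is a sketch under that assumption; `get_weather` is made up, and the exact schema pipecat expects is described in the Google Gemini service docs:

```python
# Hypothetical `tools` value in Gemini's function-declaration format.
# The `get_weather` function is invented for illustration only.
tools = [
    {
        "function_declarations": [
            {
                "name": "get_weather",
                "description": "Look up the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {"type": "string", "description": "City name"},
                    },
                    "required": ["city"],
                },
            }
        ]
    }
]
```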

Usage

Basic Setup

import os
from pipecat.services.google import GoogleVertexLLMService

llm = GoogleVertexLLMService(
    credentials_path=os.getenv("GOOGLE_APPLICATION_CREDENTIALS"),
    project_id="my-gcp-project",
    location="us-east4",
    model="gemini-2.5-flash",
)

With Credentials JSON String

import os

from pipecat.services.google import GoogleVertexLLMService

llm = GoogleVertexLLMService(
    credentials=os.getenv("GOOGLE_CREDENTIALS_JSON"),
    project_id="my-gcp-project",
    location="us-central1",
    model="gemini-2.5-flash",
    params=GoogleVertexLLMService.InputParams(
        temperature=0.7,
        top_p=0.9,
    ),
)

With Application Default Credentials

from pipecat.services.google import GoogleVertexLLMService

# Uses ADC when neither credentials nor credentials_path is provided
llm = GoogleVertexLLMService(
    project_id="my-gcp-project",
    location="us-east4",
    model="gemini-2.5-flash",
)

Notes

  • This service does not accept an api_key parameter. Use credentials, credentials_path, or Application Default Credentials instead.
  • GoogleVertexLLMService extends GoogleLLMService (not OpenAILLMService directly) and uses the Google AI Python SDK with Vertex AI authentication.
  • Authentication supports three methods: direct JSON credentials string, path to a service account key file, or Application Default Credentials (ADC).
  • The project_id parameter is required. If location is not provided, it defaults to "us-east4".
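
The notes above imply a precedence among the three authentication methods. A minimal sketch of that order (illustrative only, assuming an explicit JSON string wins over a key-file path, with ADC as the fallback; this is not the service's actual implementation):

```python
def resolve_auth_source(credentials=None, credentials_path=None):
    """Illustrative sketch of the assumed auth precedence:
    explicit JSON string, then key-file path, then ADC."""
    if credentials is not None:
        return "credentials JSON string"
    if credentials_path is not None:
        return "service account key file"
    return "Application Default Credentials (ADC)"
```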