google

package
v0.0.0-...-8c3fa5f
Published: Feb 12, 2026 License: MIT Imports: 34 Imported by: 0

Documentation

Overview

Package google provides a client implementation for interacting with Google's GenAI models. It implements the LLM Thread interface for managing conversations, tool execution, and message processing, supporting both the Vertex AI and Gemini API backends.

Index

Constants

This section is empty.

Variables

var ModelPricingMap = map[string]ModelPricing{

	"gemini-2.5-pro": {
		Input:             0.00125,
		InputHigh:         0.0025,
		Output:            0.01,
		OutputHigh:        0.015,
		ContextWindow:     2_097_152,
		HasThinking:       true,
		TieredPricing:     true,
		HighTierThreshold: 200_000,
	},

	"gemini-2.5-flash": {
		Input:         0.0003,
		AudioInput:    0.001,
		Output:        0.0025,
		ContextWindow: 1_048_576,
		HasThinking:   false,
		TieredPricing: false,
	},

	"gemini-2.5-flash-lite": {
		Input:         0.0001,
		AudioInput:    0.0003,
		Output:        0.0004,
		ContextWindow: 1_048_576,
		HasThinking:   false,
		TieredPricing: false,
	},

	"gemini-pro": {
		Input:             0.00125,
		InputHigh:         0.0025,
		Output:            0.01,
		OutputHigh:        0.015,
		ContextWindow:     2_097_152,
		HasThinking:       true,
		TieredPricing:     true,
		HighTierThreshold: 200_000,
	},

	"gemini-flash": {
		Input:         0.0003,
		AudioInput:    0.001,
		Output:        0.0025,
		ContextWindow: 1_048_576,
		HasThinking:   false,
		TieredPricing: false,
	},
}

ModelPricingMap contains pricing information for Google GenAI models, based on current Vertex AI pricing for Gemini 2.5 models
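
A minimal lookup sketch, assuming the map keys match the model names passed in the configuration; the fallback choice below is illustrative only, not a policy this package prescribes.

func ExampleModelPricingMap() {
	pricing, ok := ModelPricingMap["gemini-2.5-pro"]
	if !ok {
		// Unknown model name: fall back to a cheaper known entry
		// (an illustrative choice, not something this package mandates).
		pricing = ModelPricingMap["gemini-2.5-flash"]
	}
	fmt.Println(pricing.ContextWindow)
	// Output: 2097152
}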

Functions

func DeserializeMessages

func DeserializeMessages(rawMessages []byte) ([]*genai.Content, error)

DeserializeMessages deserializes raw message bytes into Google GenAI Content objects
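
A hedged usage sketch: rawMessages stands in for bytes previously persisted by this package (for example via SaveConversation) and is left empty here as a placeholder.

func ExampleDeserializeMessages() {
	var rawMessages []byte // placeholder: normally read back from the conversation store
	contents, err := DeserializeMessages(rawMessages)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(len(contents))
}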

func ExtractMessages

func ExtractMessages(rawMessages []byte, toolResults map[string]tooltypes.StructuredToolResult) ([]llmtypes.Message, error)

ExtractMessages converts raw Google GenAI message bytes to standard message format
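
The same caveat applies here; the empty tool-result map simply means no structured tool output is attached to the conversion.

func ExampleExtractMessages() {
	var rawMessages []byte // placeholder: normally read back from the conversation store
	toolResults := map[string]tooltypes.StructuredToolResult{}
	messages, err := ExtractMessages(rawMessages, toolResults)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(len(messages))
}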

Types

type ModelPricing

type ModelPricing struct {
	Input             float64
	InputHigh         float64
	Output            float64
	OutputHigh        float64
	AudioInput        float64
	ContextWindow     int
	HasThinking       bool
	TieredPricing     bool
	HighTierThreshold int
}

ModelPricing holds the per-token pricing for different operations
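
A hedged sketch of how these fields might be combined into a cost estimate. estimateCost is not part of this package, and it assumes the rates are quoted per 1K tokens and that HighTierThreshold is compared against the prompt size; verify both assumptions against the pricing source before relying on the numbers.

func estimateCost(p ModelPricing, inputTokens, outputTokens int) float64 {
	in, out := p.Input, p.Output
	// With tiered pricing, switch to the high-tier rates once the prompt
	// exceeds the threshold (assumption: the threshold is on input tokens).
	if p.TieredPricing && inputTokens > p.HighTierThreshold {
		in, out = p.InputHigh, p.OutputHigh
	}
	// Assumption: Input/Output rates are per 1K tokens.
	return float64(inputTokens)/1000*in + float64(outputTokens)/1000*out
}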

type Response

type Response struct {
	Text         string
	ThinkingText string
	ToolCalls    []*ToolCall
	Usage        *genai.UsageMetadata
}

Response represents a response from Google's GenAI API
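
A minimal inspection sketch; handleResponse is a hypothetical caller, and only the fields listed above are touched.

func handleResponse(resp *Response) {
	if resp.ThinkingText != "" {
		fmt.Println("thinking:", resp.ThinkingText)
	}
	fmt.Println(resp.Text)
	for _, call := range resp.ToolCalls {
		// Each tool call carries an ID, the tool name, and decoded arguments.
		fmt.Printf("tool %s (%s): %v\n", call.Name, call.ID, call.Args)
	}
	if resp.Usage != nil {
		fmt.Println("usage metadata attached")
	}
}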

type Thread

type Thread struct {
	*base.Thread // Embedded base thread with shared fields and methods
	// contains filtered or unexported fields
}

Thread implements the Thread interface using Google's GenAI API. It embeds base.Thread for shared functionality across all LLM providers.

func NewGoogleThread

func NewGoogleThread(config llmtypes.Config) (*Thread, error)

NewGoogleThread creates a new thread with Google's GenAI API
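
A hedged construction sketch; the fields of llmtypes.Config are not documented on this page, so the zero value stands in for a real configuration (model, backend, credentials, and so on).

func ExampleNewGoogleThread() {
	// Placeholder configuration: real code would populate llmtypes.Config.
	thread, err := NewGoogleThread(llmtypes.Config{})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(thread.Provider())
}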

func (*Thread) AddUserMessage

func (t *Thread) AddUserMessage(ctx context.Context, message string, imagePaths ...string)

AddUserMessage adds a user message with optional images to the thread
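
The image paths are variadic and optional; the configuration and paths below are placeholders.

func ExampleThread_AddUserMessage() {
	thread, err := NewGoogleThread(llmtypes.Config{}) // placeholder config
	if err != nil {
		log.Fatal(err)
	}
	ctx := context.Background()

	// Text-only message.
	thread.AddUserMessage(ctx, "Summarise the previous answer")

	// Message with attached images (paths are placeholders).
	thread.AddUserMessage(ctx, "Describe these screenshots", "/tmp/before.png", "/tmp/after.png")
}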

func (*Thread) CompactContext

func (t *Thread) CompactContext(ctx context.Context) error

CompactContext performs comprehensive context compaction by creating a detailed summary
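
A minimal sketch; compaction calls the model, so it needs a live context and can fail.

func ExampleThread_CompactContext() {
	thread, err := NewGoogleThread(llmtypes.Config{}) // placeholder config
	if err != nil {
		log.Fatal(err)
	}
	if err := thread.CompactContext(context.Background()); err != nil {
		log.Printf("compaction failed: %v", err)
	}
}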

func (*Thread) GetMessages

func (t *Thread) GetMessages() ([]llmtypes.Message, error)

GetMessages returns the current messages in the thread

func (*Thread) LoadConversationByID

func (t *Thread) LoadConversationByID(ctx context.Context, conversationID string) error

LoadConversationByID loads a conversation from the conversation store by ID. This is different from the loadConversation callback which loads the current conversation.

func (*Thread) Provider

func (t *Thread) Provider() string

Provider returns the name of the LLM provider for this thread

func (*Thread) SaveConversation

func (t *Thread) SaveConversation(ctx context.Context, summarise bool) error

SaveConversation persists the current conversation state to the conversation store
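
A hedged persistence round trip combining SaveConversation, LoadConversationByID, and GetMessages; the conversation ID is a placeholder, and both threads are assumed to share the same underlying conversation store.

func ExampleThread_SaveConversation() {
	ctx := context.Background()

	thread, err := NewGoogleThread(llmtypes.Config{}) // placeholder config
	if err != nil {
		log.Fatal(err)
	}

	// Persist the current state, asking for a summary to be generated.
	if err := thread.SaveConversation(ctx, true); err != nil {
		log.Fatal(err)
	}

	// Later, restore it into another thread by ID ("conv-123" is a placeholder).
	restored, err := NewGoogleThread(llmtypes.Config{})
	if err != nil {
		log.Fatal(err)
	}
	if err := restored.LoadConversationByID(ctx, "conv-123"); err != nil {
		log.Fatal(err)
	}

	messages, err := restored.GetMessages()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(len(messages))
}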

func (*Thread) SendMessage

func (t *Thread) SendMessage(
	ctx context.Context,
	message string,
	handler llmtypes.MessageHandler,
	opt llmtypes.MessageOpt,
) (finalOutput string, err error)

SendMessage sends a message to the LLM and processes the response
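
A hedged sketch of the send loop; llmtypes.MessageHandler and llmtypes.MessageOpt are defined outside this package and their shapes are not shown here, so a zero-value handler and options stand in for real ones.

func ExampleThread_SendMessage() {
	thread, err := NewGoogleThread(llmtypes.Config{}) // placeholder config
	if err != nil {
		log.Fatal(err)
	}

	// Placeholders: a real caller supplies a handler that receives streamed
	// text, thinking output, and tool events, plus whatever options
	// llmtypes.MessageOpt exposes.
	var handler llmtypes.MessageHandler
	var opt llmtypes.MessageOpt

	finalOutput, err := thread.SendMessage(context.Background(), "List the open TODOs in this repo", handler, opt)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(finalOutput)
}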

func (*Thread) ShortSummary

func (t *Thread) ShortSummary(ctx context.Context) string

ShortSummary generates a brief summary of the conversation

func (*Thread) SwapContext

func (t *Thread) SwapContext(_ context.Context, summary string) error

SwapContext replaces the conversation history with a summary message. This implements the hooks.ContextSwapper interface.
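
A hedged sketch that pairs ShortSummary with SwapContext to shrink a long conversation; treating this pairing as the intended way to drive the hooks.ContextSwapper interface is an assumption.

func ExampleThread_SwapContext() {
	thread, err := NewGoogleThread(llmtypes.Config{}) // placeholder config
	if err != nil {
		log.Fatal(err)
	}
	ctx := context.Background()

	// Generate a brief summary of the conversation so far ...
	summary := thread.ShortSummary(ctx)

	// ... then replace the full history with just that summary.
	if err := thread.SwapContext(ctx, summary); err != nil {
		log.Printf("context swap failed: %v", err)
	}
}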

type ToolCall

type ToolCall struct {
	ID   string
	Name string
	Args map[string]any
}

ToolCall represents a tool call in Google's response format
