openai

package
v0.1.14
Published: Oct 20, 2025 License: MIT Imports: 10 Imported by: 265

Documentation

Overview

Package openai provides an interface to OpenAI's language models.

Token Limits

To set token limits with OpenAI models, use openai.WithMaxCompletionTokens(). The OpenAI API now uses max_completion_tokens as the field that limits output tokens.

// Recommended for clarity:
llm.GenerateContent(ctx, messages,
    openai.WithMaxCompletionTokens(100),
)

// Also works (backward compatible):
llm.GenerateContent(ctx, messages,
    llms.WithMaxTokens(100),
)

Both options set the same underlying field. By default, the implementation sends max_completion_tokens (the modern field). For older OpenAI-compatible servers that only support max_tokens, use WithLegacyMaxTokensField():

llm.GenerateContent(ctx, messages,
    llms.WithMaxTokens(100),
    openai.WithLegacyMaxTokensField(), // Forces use of max_tokens field
)

Index

Constants

const (
	RoleSystem    = "system"
	RoleAssistant = "assistant"
	RoleUser      = "user"
	RoleFunction  = "function"
	RoleTool      = "tool"
)
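As a minimal sketch, the role constants can be used when building chat messages by hand, so typos in role strings are caught at compile time. This assumes ChatMessage exposes Role and Content fields; check the openaiclient package for the exact shape.

```go
package main

import (
	"fmt"

	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	// Use the exported role constants rather than raw strings.
	msgs := []openai.ChatMessage{
		{Role: openai.RoleSystem, Content: "You are a terse assistant."},
		{Role: openai.RoleUser, Content: "Name one prime number."},
	}
	for _, m := range msgs {
		fmt.Printf("%s: %v\n", m.Role, m.Content)
	}
}
```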
const (
	DefaultAPIVersion = "2023-05-15"
)

Variables

var (
	ErrEmptyResponse              = errors.New("no response")
	ErrMissingToken               = errors.New("missing the OpenAI API key, set it in the OPENAI_API_KEY environment variable") //nolint:lll
	ErrMissingAzureModel          = errors.New("model needs to be provided when using Azure API")
	ErrMissingAzureEmbeddingModel = errors.New("embeddings model needs to be provided when using Azure API")

	ErrUnexpectedResponseLength = errors.New("unexpected length of response")
)
var ResponseFormatJSON = &ResponseFormat{Type: "json_object"} //nolint:gochecknoglobals

ResponseFormatJSON is the JSON response format.
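A sketch of passing this global to WithResponseFormat to request JSON-object output (import paths assume the github.com/tmc/langchaingo module layout; note that JSON mode generally requires mentioning JSON in the prompt):

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	// Construct a client that asks the API for JSON-object responses.
	llm, err := openai.New(openai.WithResponseFormat(openai.ResponseFormatJSON))
	if err != nil {
		log.Fatal(err)
	}
	out, err := llms.GenerateFromSinglePrompt(context.Background(), llm,
		"Return a JSON object with keys name and population for Paris.")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(out)
}
```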

Functions

func ExtractToolParts added in v0.1.8

func ExtractToolParts(msg *ChatMessage) ([]llms.ContentPart, []llms.ToolCall)

ExtractToolParts extracts the tool parts from a message.

func MapError added in v0.1.14

func MapError(err error) error

MapError maps OpenAI-specific errors to standardized error codes.
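A hedged sketch of using MapError at a call site so callers see standardized error codes; whether GenerateContent already maps errors internally is not stated here, so treat the wrapper as illustrative:

```go
package llmutil

import (
	"context"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/openai"
)

// generate wraps GenerateContent and normalizes provider-specific
// errors via MapError before returning them to callers.
func generate(ctx context.Context, llm *openai.LLM, msgs []llms.MessageContent) (*llms.ContentResponse, error) {
	resp, err := llm.GenerateContent(ctx, msgs)
	if err != nil {
		return nil, openai.MapError(err)
	}
	return resp, nil
}
```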

func WithLegacyMaxTokensField added in v0.1.14

func WithLegacyMaxTokensField() llms.CallOption

WithLegacyMaxTokensField forces the use of the max_tokens field instead of max_completion_tokens. This is useful when connecting to older OpenAI-compatible inference servers that only support the max_tokens field and don't recognize max_completion_tokens.

Usage:

llm.GenerateContent(ctx, messages,
    llms.WithMaxTokens(100),
    openai.WithLegacyMaxTokensField(), // Forces use of max_tokens field
)

func WithMaxCompletionTokens added in v0.1.14

func WithMaxCompletionTokens(maxTokens int) llms.CallOption

WithMaxCompletionTokens sets the max_completion_tokens field for token generation. This is the recommended way to limit tokens with OpenAI models.

Usage:

llm.GenerateContent(ctx, messages,
    openai.WithMaxCompletionTokens(100),
)

Note: While llms.WithMaxTokens() still works for backward compatibility, WithMaxCompletionTokens is preferred for clarity when using OpenAI.

Types

type APIType

type APIType openaiclient.APIType

type ChatMessage

type ChatMessage = openaiclient.ChatMessage

type LLM

type LLM struct {
	CallbacksHandler callbacks.Handler
	// contains filtered or unexported fields
}

func New

func New(opts ...Option) (*LLM, error)

New returns a new OpenAI LLM.
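A minimal construction sketch. The model name is illustrative, and the API key is read from OPENAI_API_KEY unless WithToken is supplied:

```go
package main

import (
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	// WithModel is optional for the public OpenAI API; "gpt-4o" is
	// only an example model name.
	llm, err := openai.New(openai.WithModel("gpt-4o"))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("client ready: %T\n", llm)
}
```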

func (*LLM) Call

func (o *LLM) Call(ctx context.Context, prompt string, options ...llms.CallOption) (string, error)

Call requests a completion for the given prompt.

func (*LLM) CreateEmbedding

func (o *LLM) CreateEmbedding(ctx context.Context, inputTexts []string) ([][]float32, error)

CreateEmbedding creates embeddings for the given input texts.
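A sketch of embedding a batch of texts; the embedding model name is an example, so pick one your account supports:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	llm, err := openai.New(openai.WithEmbeddingModel("text-embedding-3-small"))
	if err != nil {
		log.Fatal(err)
	}
	vecs, err := llm.CreateEmbedding(context.Background(),
		[]string{"first document", "second document"})
	if err != nil {
		log.Fatal(err)
	}
	// CreateEmbedding returns one vector per input text.
	fmt.Println(len(vecs), len(vecs[0]))
}
```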

func (*LLM) GenerateContent added in v0.1.4

func (o *LLM) GenerateContent(ctx context.Context, messages []llms.MessageContent, options ...llms.CallOption) (*llms.ContentResponse, error)

GenerateContent implements the Model interface.
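A sketch combining GenerateContent with the token-limit option from the overview; the llms.TextParts helper and message-type constants are assumed from the companion llms package:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	llm, err := openai.New()
	if err != nil {
		log.Fatal(err)
	}
	// llms.TextParts builds a single-part text message for a role.
	messages := []llms.MessageContent{
		llms.TextParts(llms.ChatMessageTypeSystem, "Answer in one sentence."),
		llms.TextParts(llms.ChatMessageTypeHuman, "What is an embedding?"),
	}
	resp, err := llm.GenerateContent(context.Background(), messages,
		openai.WithMaxCompletionTokens(100),
	)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Choices[0].Content)
}
```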

func (*LLM) SupportsReasoning added in v0.1.14

func (o *LLM) SupportsReasoning() bool

SupportsReasoning implements the ReasoningModel interface. Returns true if the current model supports reasoning/thinking tokens.

type ModelCapability added in v0.1.14

type ModelCapability struct {
	Pattern          string // Regex pattern to match model names
	SupportsSystem   bool   // If true, supports system messages
	SupportsThinking bool   // If true, supports reasoning/thinking
	SupportsCaching  bool   // If true, supports prompt caching
}

ModelCapability defines what a model supports.

type Option

type Option func(*options)

Option is a functional option for the OpenAI client.

func WithAPIType

func WithAPIType(apiType APIType) Option

WithAPIType passes the API type to the client. If not set, the default value is APITypeOpenAI.

func WithAPIVersion

func WithAPIVersion(apiVersion string) Option

WithAPIVersion passes the API version to the client. If not set, the default value is DefaultAPIVersion.

func WithBaseURL

func WithBaseURL(baseURL string) Option

WithBaseURL passes the OpenAI base URL to the client. If not set, the base URL is read from the OPENAI_BASE_URL environment variable; if that is also unset, the default https://api.openai.com/v1 is used.

func WithCallback added in v0.1.1

func WithCallback(callbackHandler callbacks.Handler) Option

WithCallback allows setting a custom Callback Handler.

func WithEmbeddingDimensions added in v0.1.14

func WithEmbeddingDimensions(dimensions int) Option

WithEmbeddingDimensions passes the embedding dimensions to the client. Requires a compatible model, such as text-embedding-3 or later. For details, see the OpenAI documentation: https://platform.openai.com/docs/api-reference/embeddings/create#embeddings-create-dimensions

func WithEmbeddingModel

func WithEmbeddingModel(embeddingModel string) Option

WithEmbeddingModel passes the OpenAI embedding model to the client. Required when the APIType is Azure.

func WithHTTPClient

func WithHTTPClient(client openaiclient.Doer) Option

WithHTTPClient allows setting a custom HTTP client. If not set, the default value is http.DefaultClient.

func WithModel

func WithModel(model string) Option

WithModel passes the OpenAI model to the client. If not set, the model is read from the OPENAI_MODEL environment variable. Required when the APIType is Azure.
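Putting the Azure-related options together, a hedged sketch of configuring an Azure OpenAI client. The endpoint and deployment names are placeholders, and the APITypeAzure constant should be verified against the package:

```go
package main

import (
	"log"

	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	// On Azure, WithModel and WithEmbeddingModel name your deployments,
	// not public model IDs; both values below are placeholders.
	llm, err := openai.New(
		openai.WithAPIType(openai.APITypeAzure),
		openai.WithAPIVersion(openai.DefaultAPIVersion),
		openai.WithBaseURL("https://example.openai.azure.com"),
		openai.WithModel("my-gpt-deployment"),
		openai.WithEmbeddingModel("my-embedding-deployment"),
	)
	if err != nil {
		log.Fatal(err)
	}
	_ = llm
}
```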

func WithOrganization

func WithOrganization(organization string) Option

WithOrganization passes the OpenAI organization to the client. If not set, the organization is read from the OPENAI_ORGANIZATION environment variable.

func WithResponseFormat added in v0.1.6

func WithResponseFormat(responseFormat *ResponseFormat) Option

WithResponseFormat allows setting a custom response format.

func WithToken

func WithToken(token string) Option

WithToken passes the OpenAI API token to the client. If not set, the token is read from the OPENAI_API_KEY environment variable.

type ResponseFormat added in v0.1.6

type ResponseFormat = openaiclient.ResponseFormat

ResponseFormat is the response format for the OpenAI client.

type ResponseFormatJSONSchema added in v0.1.13

type ResponseFormatJSONSchema = openaiclient.ResponseFormatJSONSchema

ResponseFormatJSONSchema is the JSON Schema response format in structured output.

type ResponseFormatJSONSchemaProperty added in v0.1.13

type ResponseFormatJSONSchemaProperty = openaiclient.ResponseFormatJSONSchemaProperty

ResponseFormatJSONSchemaProperty is the JSON Schema property in structured output.
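A heavily hedged sketch of building a structured-output response format from these aliased types. The field names shown (Name, Strict, Schema, Properties, Required) are assumptions about the openaiclient structs and should be verified there:

```go
package main

import (
	"log"

	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	// Field names in the literals below are assumptions; verify them
	// against the openaiclient package before use.
	format := &openai.ResponseFormat{
		Type: "json_schema",
		JSONSchema: &openai.ResponseFormatJSONSchema{
			Name:   "city_info",
			Strict: true,
			Schema: &openai.ResponseFormatJSONSchemaProperty{
				Type: "object",
				Properties: map[string]*openai.ResponseFormatJSONSchemaProperty{
					"name":       {Type: "string"},
					"population": {Type: "integer"},
				},
				Required: []string{"name", "population"},
			},
		},
	}
	llm, err := openai.New(openai.WithResponseFormat(format))
	if err != nil {
		log.Fatal(err)
	}
	_ = llm
}
```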

Directories

Path Synopsis
internal
