llm

package module
v0.0.5
Published: Feb 10, 2025 License: MIT Imports: 13 Imported by: 0

README

LLM

What is this?

This is a very small subset of the Langchain project in Go.

[!Important] It is quite heavily based on the more ambitious LangChainGo; for now, you should likely use that instead.

My main objective with this module is to have declarations of some core interfaces (such as llm.Provider and llm.Chain) and other types that I can use as building blocks for LLM-based Go applications. I do not intend for this module to contain many (or even any) implementations of these interfaces.
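
A minimal sketch of what such a building block looks like: a toy llm.Provider that simply echoes its input. The import path and the echoProvider type are illustrative, not part of this module.

package main

import (
	"context"
	"fmt"

	llm "example.com/llm" // placeholder import path
)

// echoProvider is a toy Provider that echoes the last text part it receives.
type echoProvider struct{}

func (echoProvider) GenerateContent(ctx context.Context, messages []llm.Message, options ...llm.ContentOption) (*llm.ContentResponse, error) {
	var last string
	for _, m := range messages {
		for _, p := range m.Parts {
			if tc, ok := p.(llm.TextContent); ok {
				last = tc.Text
			}
		}
	}
	return &llm.ContentResponse{
		Choices: []*llm.ContentChoice{{Content: "echo: " + last}},
	}, nil
}

func main() {
	out, err := llm.Call(context.Background(), echoProvider{}, "hello")
	if err != nil {
		panic(err)
	}
	fmt.Println(out)
}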

Examples

See ./examples for example usage.

License (MIT)

Since LangChainGo is licensed under MIT, it goes without saying that this project should be as well 📋

Documentation

Overview

Package llm implements a very small subset of the langchain project in Go.

Index

Constants

This section is empty.

Variables

var (
	// ErrWrongOutputTypeInRun is returned in the run function if the chain returns a value that is not a string.
	ErrWrongOutputTypeInRun = errors.New("run not supported in chain that returns value that is not string")
	// ErrUnexpectedChatMessageType is returned when a chat message is of an unexpected type.
	ErrUnexpectedChatMessageType = errors.New("unexpected chat message type")
	// ErrUnableToParseOutput is returned if the output of the llm is unparsable.
	ErrUnableToParseOutput = errors.New("unable to parse agent output")
	// ErrOutputNotStringInPredict is returned if a chain does not return a string in the predict function.
	ErrOutputNotStringInPredict = errors.New("predict is not supported with a chain that does not return a string")
	// ErrNotFinished is returned if the agent does not finish before the number of iterations exceeds max iterations.
	ErrNotFinished = errors.New("agent not finished before max iterations")
	// ErrNeedChatMessageList is returned when the variable is not a list of chat messages.
	ErrNeedChatMessageList = errors.New("variable should be a list of chat messages")
	// ErrMultipleOutputsInRun is returned in the run function if the chain expects more than one output value.
	ErrMultipleOutputsInRun = errors.New("run not supported in chain with more then one expected output")
	// ErrMultipleOutputsInPredict is returned if a chain has multiple return values in predict.
	ErrMultipleOutputsInPredict = errors.New("predict is not supported with a chain that returns multiple values")
	// ErrMultipleInputsInRun is returned in the run function if the chain expects more than one input value.
	ErrMultipleInputsInRun = errors.New("run not supported in chain with more then one expected input")
	// ErrMissingMemoryKeyValues is returned when an expected memory key is missing from the input values to a chain.
	ErrMissingMemoryKeyValues = errors.New("missing memory key in input values")
	// ErrMissingInputValues is returned when an expected key is missing from the input values to a chain.
	ErrMissingInputValues = errors.New("missing key in input values")
	// ErrMismatchMetadatasAndText is returned when the number of texts and metadatas given to CreateDocuments does not match. The function will not error if the length of the metadatas slice is zero.
	ErrMismatchMetadatasAndText = errors.New("number of texts and metadatas does not match")
	// ErrMemoryValuesWrongType is returned if the memory value to a chain is of the wrong type.
	ErrMemoryValuesWrongType = errors.New("memory key is of wrong type")
	// ErrInvalidPartialVariableType is returned when a partial variable is not a string or a function.
	ErrInvalidPartialVariableType = errors.New("invalid partial variable type")
	// ErrInvalidOutputValues is returned when the expected output keys of a chain do not match the actual keys in the returned output values map.
	ErrInvalidOutputValues = errors.New("missing key in output values")
	// ErrInvalidOptions is returned if the options given to the initializer are invalid.
	ErrInvalidOptions = errors.New("invalid options")
	// ErrInvalidInputValues is returned if the input values to a chain are invalid.
	ErrInvalidInputValues = errors.New("invalid input values")
	// ErrInvalidValues is returned if the input values to a chain are invalid.
	ErrInvalidValues = errors.New("invalid values")
	// ErrInputValuesWrongType is returned if an input value to a chain is of the wrong type.
	ErrInputValuesWrongType = errors.New("input key is of wrong type")
	// ErrExecutorInputNotString is returned if an input to the executor call function is not a string.
	ErrExecutorInputNotString = errors.New("input to executor not string")
	// ErrEmptyResponseFromProvider is returned when the provider returns an empty response.
	ErrEmptyResponseFromProvider = fmt.Errorf("empty response from provider")
	// ErrChainInitialization is returned if a chain is not initialized appropriately.
	ErrChainInitialization = errors.New("error initializing chain")
	// ErrAgentNoReturn is returned if the agent returns no actions and no finish.
	ErrAgentNoReturn = errors.New("no actions or finish was returned by the agent")
)

Functions

func BufferString

func BufferString(messages []ChatMessage, humanPrefix string, aiPrefix string) (string, error)

BufferString gets the buffer string of messages.
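
A hedged usage sketch (the import path is a placeholder; the exact formatting of the returned buffer is up to the implementation):

package main

import (
	"fmt"

	llm "example.com/llm" // placeholder import path
)

func main() {
	msgs := []llm.ChatMessage{
		llm.HumanChatMessage{Content: "Hi, who are you?"},
		llm.AIChatMessage{Content: "I'm an assistant."},
	}
	// Flatten the conversation into a single prefixed transcript.
	buf, err := llm.BufferString(msgs, "Human", "AI")
	if err != nil {
		panic(err)
	}
	fmt.Println(buf)
}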

func Call

func Call(ctx context.Context, provider Provider, prompt string, options ...ContentOption) (string, error)

Call is a convenience function for calling an LLM provider with a single string prompt, expecting a single string response. It's useful for simple, string-only interactions and provides a slightly more ergonomic API than the more general [Provider.GenerateContent].
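
A hedged sketch of a typical call; any Provider implementation (for example, one from ./providers) can be passed in, and the import path is a placeholder:

package examples

import (
	"context"

	llm "example.com/llm" // placeholder import path
)

// ask sends a single prompt and returns the first choice's text content.
func ask(ctx context.Context, provider llm.Provider) (string, error) {
	return llm.Call(ctx, provider, "Name three Go proverbs.",
		llm.WithTemperature(0.2),
		llm.WithMaxTokens(256),
	)
}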

func ChainApply

func ChainApply(ctx context.Context, c Chain, inputValues []map[string]any, maxWorkers int, options ...ChainOption) ([]map[string]any, error)

ChainApply executes the chain for each of the inputs asynchronously.

func ChainCall

func ChainCall(ctx context.Context, c Chain, inputValues map[string]any, options ...ChainOption) (map[string]any, error)

ChainCall is the standard function used for executing chains.

func ChainPredict

func ChainPredict(ctx context.Context, c Chain, inputValues map[string]any, options ...ChainOption) (string, error)

ChainPredict can be used to execute a chain if the chain only expects one string output.

func ChainRun

func ChainRun(ctx context.Context, c Chain, input any, options ...ChainOption) (string, error)

ChainRun can be used to execute a chain if the chain only expects one input and one string output.
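
A hedged end-to-end sketch tying ChainRun to NewChain and GoTemplate; the import path is a placeholder and the {{.text}} syntax assumes the text/template-backed renderer:

package examples

import (
	"context"

	llm "example.com/llm" // placeholder import path
)

// summarize builds a single-prompt chain and runs it with one string input.
func summarize(ctx context.Context, provider llm.Provider, text string) (string, error) {
	prompt := llm.GoTemplate("Summarize the following text:\n\n{{.text}}", []string{"text"})
	chain := llm.NewChain(provider, prompt)
	// ChainRun is valid here because the chain has exactly one input and
	// produces a single string output.
	return llm.ChainRun(ctx, chain, text, llm.ChainWithTemperature(0.3))
}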

func ResolvePartialValues

func ResolvePartialValues(partialValues map[string]any, values map[string]any) (map[string]any, error)

Types

type AIChatMessage

type AIChatMessage struct {
	// Content is the content of the message.
	Content string `json:"content,omitempty"`

	// FunctionCall represents the model choosing to call a function.
	FunctionCall *FunctionCall `json:"function_call,omitempty"`

	// ToolCalls represents the model choosing to call tools.
	ToolCalls []ToolCall `json:"tool_calls,omitempty"`
}

AIChatMessage is a message sent by an AI.

func (AIChatMessage) MessageContent

func (ai AIChatMessage) MessageContent() string

func (AIChatMessage) MessageFunctionCall

func (ai AIChatMessage) MessageFunctionCall() *FunctionCall

func (AIChatMessage) Type

func (ai AIChatMessage) Type() ChatMessageType

type Agent

type Agent interface {
	// Plan decides what to do next, given an input and previous steps. It returns either actions or a finish.
	Plan(ctx context.Context, steps []AgentStep, inputs map[string]string) ([]AgentAction, *AgentFinish, error)
	InputKeys() []string
	OutputKeys() []string
	Tools() []AgentTool
}

Agent is the interface all agents must implement.

type AgentAction

type AgentAction struct {
	Tool      string
	ToolInput string
	Log       string
	ToolID    string
}

AgentAction is the agent's action to take.

type AgentFinish

type AgentFinish struct {
	ReturnValues map[string]any
	Log          string
}

AgentFinish is the agent's return value.

type AgentHooker

type AgentHooker interface {
	AgentHooks() AgentHooks
}

An AgentHooker can return its AgentHooks.

type AgentHooks

type AgentHooks interface {
	AgentAction(ctx context.Context, action AgentAction)
	AgentFinish(ctx context.Context, finish AgentFinish)

	StreamingFunc(ctx context.Context, chunk []byte)
}

AgentHooks contains the hooks that can be used by an agent.

type AgentStep

type AgentStep struct {
	Action      AgentAction
	Observation string
}

AgentStep is a step of the agent.

type AgentTool

type AgentTool interface {
	Name() string
	Description() string
	Call(ctx context.Context, input string) (string, error)
}

AgentTool is a tool for the LLM agent to interact with different applications.
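
Implementing the interface is straightforward; the clockTool below is an illustrative sketch, not part of this module:

package examples

import (
	"context"
	"time"

	llm "example.com/llm" // placeholder import path
)

// clockTool is a toy AgentTool that reports the current time.
type clockTool struct{}

func (clockTool) Name() string        { return "clock" }
func (clockTool) Description() string { return "Returns the current time in RFC 3339 format." }

func (clockTool) Call(ctx context.Context, input string) (string, error) {
	return time.Now().Format(time.RFC3339), nil
}

// Compile-time check that clockTool satisfies the interface.
var _ llm.AgentTool = clockTool{}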

type BinaryContent

type BinaryContent struct {
	MIMEType string
	Data     []byte
}

BinaryContent is content holding some binary data with a MIME type.

func BinaryPart

func BinaryPart(mime string, data []byte) BinaryContent

BinaryPart creates a new BinaryContent from the given MIME type (e.g. "image/png") and binary data.

func (BinaryContent) String

func (bc BinaryContent) String() string

type Chain

type Chain interface {
	// Call runs the logic of the chain and returns the output. This method should
	// not be called directly. Use the ChainCall, ChainRun or ChainPredict
	// functions instead; they handle the memory and other aspects of the chain.
	Call(ctx context.Context, inputs map[string]any, options ...ChainOption) (map[string]any, error)
	// Memory returns the memory of the chain.
	Memory() Memory
	// InputKeys returns the input keys the chain expects.
	InputKeys() []string
	// OutputKeys returns the output keys the chain returns.
	OutputKeys() []string
}

Chain is the interface all chains must implement.

func NewChain

func NewChain(provider Provider, prompter PromptFormatter, opts ...ChainOption) Chain

NewChain creates a new chain from an LLM provider and a prompt formatter.

type ChainHooker

type ChainHooker interface {
	ChainHooks() ChainHooks
}

A ChainHooker can return its ChainHooks

type ChainHooks

type ChainHooks interface {
	ChainStart(ctx context.Context, inputs map[string]any)
	ChainEnd(ctx context.Context, outputs map[string]any)
	ChainError(ctx context.Context, err error)

	StreamingFunc(ctx context.Context, chunk []byte)
}

ChainHooks contains the hooks that can be used by a chain.

type ChainOption

type ChainOption func(*ChainOptions)

ChainOption is a function that configures ChainOptions.

func ChainWithHooks

func ChainWithHooks(hooks ChainHooks) ChainOption

ChainWithHooks allows setting custom Hooks.

func ChainWithMaxLength

func ChainWithMaxLength(maxLength int) ChainOption

ChainWithMaxLength will add an option to set the maximum length of the generated text for LLM.Call.

func ChainWithMaxTokens

func ChainWithMaxTokens(maxTokens int) ChainOption

ChainWithMaxTokens will add an option to set the maximum number of tokens to generate for LLM.Call.

func ChainWithMemory

func ChainWithMemory(memory Memory) ChainOption

ChainWithMemory allows setting what memory should be used by the Chain.

func ChainWithMinLength

func ChainWithMinLength(minLength int) ChainOption

ChainWithMinLength will add an option to set the minimum length of the generated text for LLM.Call.

func ChainWithModel

func ChainWithModel(model string) ChainOption

ChainWithModel will add an option to set the model to use for LLM.Call.

func ChainWithOutputKey

func ChainWithOutputKey(outputKey string) ChainOption

ChainWithOutputKey allows setting what output key should be used by the Chain. (Defaults to "text")

func ChainWithParser

func ChainWithParser(parser Parser[any]) ChainOption

ChainWithParser allows setting what parser should be used by the Chain.

func ChainWithRepetitionPenalty

func ChainWithRepetitionPenalty(repetitionPenalty float64) ChainOption

ChainWithRepetitionPenalty will add an option to set the repetition penalty for sampling.

func ChainWithSeed

func ChainWithSeed(seed int) ChainOption

ChainWithSeed will add an option to use deterministic sampling for LLM.Call.

func ChainWithStopWords

func ChainWithStopWords(stopWords []string) ChainOption

ChainWithStopWords is an option for setting the stop words for LLM.Call.

func ChainWithStreamingFunc

func ChainWithStreamingFunc(streamingFunc func(ctx context.Context, chunk []byte) error) ChainOption

ChainWithStreamingFunc is an option for LLM.Call that allows streaming responses.

func ChainWithTemperature

func ChainWithTemperature(temperature float64) ChainOption

ChainWithTemperature will add an option to set the sampling temperature for LLM.Call.

func ChainWithTopK

func ChainWithTopK(topK int) ChainOption

ChainWithTopK will add an option to use top-k sampling for LLM.Call.

func ChainWithTopP

func ChainWithTopP(topP float64) ChainOption

ChainWithTopP will add an option to use top-p sampling for LLM.Call.

type ChainOptions

type ChainOptions struct {
	// Model is the model to use in an LLM call.
	Model string

	// MaxTokens is the maximum number of tokens to generate in an LLM call.
	MaxTokens int

	// Temperature is the temperature for sampling to use in an LLM call, between 0 and 1.
	Temperature float64

	// StopWords is a list of words to stop on to use in an LLM call.
	StopWords []string

	// StreamingFunc is a function to be called for each chunk of a streaming response.
	// Return an error to stop streaming early.
	StreamingFunc func(ctx context.Context, chunk []byte) error

	// TopK is the number of tokens to consider for top-k sampling in an LLM call.
	TopK int

	// TopP is the cumulative probability for top-p sampling in an LLM call.
	TopP float64

	// Seed is a seed for deterministic sampling in an LLM call.
	Seed int

	// MinLength is the minimum length of the generated text in an LLM call.
	MinLength int

	// MaxLength is the maximum length of the generated text in an LLM call.
	MaxLength int

	// RepetitionPenalty is the repetition penalty for sampling in an LLM call.
	RepetitionPenalty float64

	// OutputKey is the output key used by the Chain.
	OutputKey string

	// Hooks are the hooks used by the Chain.
	Hooks ChainHooks

	// Parser is the parser used by the Chain.
	Parser Parser[any]

	// Memory is the memory used by the Chain.
	Memory Memory
	// contains filtered or unexported fields
}

ChainOptions is a set of options for a chain.

type ChatMessage

type ChatMessage interface {
	// Type gets the type of the message.
	Type() ChatMessageType
	// MessageContent gets the content of the message.
	MessageContent() string
}

ChatMessage represents a message in a chat.

type ChatMessageHistory

type ChatMessageHistory interface {
	// AddMessage adds a message to the store.
	AddMessage(ctx context.Context, message ChatMessage) error

	// AddUserMessage is a convenience method for adding a human message string
	// to the store.
	AddUserMessage(ctx context.Context, message string) error

	// AddAIMessage is a convenience method for adding an AI message string to
	// the store.
	AddAIMessage(ctx context.Context, message string) error

	// Clear removes all messages from the store.
	Clear(ctx context.Context) error

	// Messages retrieves all messages from the store
	Messages(ctx context.Context) ([]ChatMessage, error)

	// SetMessages replaces existing messages in the store
	SetMessages(ctx context.Context, messages []ChatMessage) error
}

ChatMessageHistory is the interface for chat history in memory/store.

type ChatMessageType

type ChatMessageType string

ChatMessageType is the type of chat message.

const (
	// ChatMessageTypeAI is a message sent by an AI.
	ChatMessageTypeAI ChatMessageType = "ai"
	// ChatMessageTypeHuman is a message sent by a human.
	ChatMessageTypeHuman ChatMessageType = "human"
	// ChatMessageTypeSystem is a message sent by the system.
	ChatMessageTypeSystem ChatMessageType = "system"
	// ChatMessageTypeGeneric is a message sent by a generic user.
	ChatMessageTypeGeneric ChatMessageType = "generic"
	// ChatMessageTypeFunction is a message sent by a function.
	ChatMessageTypeFunction ChatMessageType = "function"
	// ChatMessageTypeTool is a message sent by a tool.
	ChatMessageTypeTool ChatMessageType = "tool"
)

type ContentChoice

type ContentChoice struct {
	// Content is the textual content of a response
	Content string

	// StopReason is the reason the model stopped generating output.
	StopReason string

	// GenerationInfo is arbitrary information the model adds to the response.
	GenerationInfo map[string]any

	// FuncCall is non-nil when the model asks to invoke a function/tool.
	// If a model invokes more than one function/tool, this field will only
	// contain the first one.
	FuncCall *FunctionCall

	// ToolCalls is a list of tool calls the model asks to invoke.
	ToolCalls []ToolCall
}

ContentChoice is one of the response choices returned by GenerateContent calls.

type ContentOption

type ContentOption func(*ContentOptions)

ContentOption is a function that configures Options.

func WithCandidateCount

func WithCandidateCount(c int) ContentOption

WithCandidateCount specifies the number of response candidates to generate.

func WithFrequencyPenalty

func WithFrequencyPenalty(frequencyPenalty float64) ContentOption

WithFrequencyPenalty will add an option to set the frequency penalty for sampling.

func WithJSONMode

func WithJSONMode() ContentOption

WithJSONMode will add an option to set the response format to JSON. This is useful for models that return structured data.

func WithMaxLength

func WithMaxLength(maxLength int) ContentOption

WithMaxLength will add an option to set the maximum length of the generated text.

func WithMaxTokens

func WithMaxTokens(maxTokens int) ContentOption

WithMaxTokens specifies the max number of tokens to generate.

func WithMetadata

func WithMetadata(metadata map[string]any) ContentOption

WithMetadata will add an option to set metadata to include in the request. The meaning of this field is specific to the backend in use.

func WithMinLength

func WithMinLength(minLength int) ContentOption

WithMinLength will add an option to set the minimum length of the generated text.

func WithModel

func WithModel(model string) ContentOption

WithModel specifies which model name to use.

func WithN

func WithN(n int) ContentOption

WithN will add an option to set how many chat completion choices to generate for each input message.

func WithOptions

func WithOptions(options ContentOptions) ContentOption

WithOptions specifies options.

func WithPresencePenalty

func WithPresencePenalty(presencePenalty float64) ContentOption

WithPresencePenalty will add an option to set the presence penalty for sampling.

func WithRepetitionPenalty

func WithRepetitionPenalty(repetitionPenalty float64) ContentOption

WithRepetitionPenalty will add an option to set the repetition penalty for sampling.

func WithSeed

func WithSeed(seed int) ContentOption

WithSeed will add an option to use deterministic sampling.

func WithStopWords

func WithStopWords(stopWords []string) ContentOption

WithStopWords specifies a list of words to stop generation on.

func WithStreamingFunc

func WithStreamingFunc(streamingFunc func(ctx context.Context, chunk []byte) error) ContentOption

WithStreamingFunc specifies the streaming function to use.

func WithTemperature

func WithTemperature(temperature float64) ContentOption

WithTemperature specifies the model temperature, a hyperparameter that regulates the randomness, or creativity, of the AI's responses.

func WithToolChoice

func WithToolChoice(choice any) ContentOption

WithToolChoice will add an option to set the choice of tool to use. It can either be "none", "auto" (the default behavior), or a specific tool as described in the ToolChoice type.

func WithTools

func WithTools(tools []Tool) ContentOption

WithTools will add an option to set the tools to use.
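
A hedged sketch of passing a function tool to GenerateContent and reading back any requested tool calls; the tool name, JSON schema and import path are illustrative:

package examples

import (
	"context"
	"fmt"

	llm "example.com/llm" // placeholder import path
)

func callWithTool(ctx context.Context, provider llm.Provider) error {
	weather := llm.Tool{
		Type: "function",
		Function: &llm.FunctionDefinition{
			Name:        "get_weather",
			Description: "Get the current weather for a city.",
			Parameters: map[string]any{
				"type": "object",
				"properties": map[string]any{
					"city": map[string]any{"type": "string"},
				},
				"required": []string{"city"},
			},
		},
	}

	resp, err := provider.GenerateContent(ctx,
		[]llm.Message{llm.TextParts(llm.ChatMessageTypeHuman, "What's the weather in Oslo?")},
		llm.WithTools([]llm.Tool{weather}),
	)
	if err != nil || len(resp.Choices) == 0 {
		return err
	}
	for _, tc := range resp.Choices[0].ToolCalls {
		if tc.FunctionCall != nil {
			fmt.Println("model requested:", tc.FunctionCall.Name, tc.FunctionCall.Arguments)
		}
	}
	return nil
}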

func WithTopK

func WithTopK(topK int) ContentOption

WithTopK will add an option to use top-k sampling.

func WithTopP

func WithTopP(topP float64) ContentOption

WithTopP will add an option to use top-p sampling.

type ContentOptions

type ContentOptions struct {
	// Model is the model to use.
	Model string `json:"model"`
	// CandidateCount is the number of response candidates to generate.
	CandidateCount int `json:"candidate_count"`
	// MaxTokens is the maximum number of tokens to generate.
	MaxTokens int `json:"max_tokens"`
	// Temperature is the temperature for sampling, between 0 and 1.
	Temperature float64 `json:"temperature"`
	// StopWords is a list of words to stop on.
	StopWords []string `json:"stop_words"`
	// StreamingFunc is a function to be called for each chunk of a streaming response.
	// Return an error to stop streaming early.
	StreamingFunc func(ctx context.Context, chunk []byte) error `json:"-"`
	// TopK is the number of tokens to consider for top-k sampling.
	TopK int `json:"top_k"`
	// TopP is the cumulative probability for top-p sampling.
	TopP float64 `json:"top_p"`
	// Seed is a seed for deterministic sampling.
	Seed int `json:"seed"`
	// MinLength is the minimum length of the generated text.
	MinLength int `json:"min_length"`
	// MaxLength is the maximum length of the generated text.
	MaxLength int `json:"max_length"`
	// N is how many chat completion choices to generate for each input message.
	N int `json:"n"`
	// RepetitionPenalty is the repetition penalty for sampling.
	RepetitionPenalty float64 `json:"repetition_penalty"`
	// FrequencyPenalty is the frequency penalty for sampling.
	FrequencyPenalty float64 `json:"frequency_penalty"`
	// PresencePenalty is the presence penalty for sampling.
	PresencePenalty float64 `json:"presence_penalty"`

	// JSONMode is a flag to enable JSON mode.
	JSONMode bool `json:"json"`

	// Tools is a list of tools to use. Each tool can be a specific tool or a function.
	Tools []Tool `json:"tools,omitempty"`
	// ToolChoice is the choice of tool to use, it can either be "none", "auto" (the default behavior),
	// or a specific tool as described in the ToolChoice type.
	ToolChoice any `json:"tool_choice"`

	// Metadata is a map of metadata to include in the request.
	// The meaning of this field is specific to the backend in use.
	Metadata map[string]any `json:"metadata,omitempty"`
}

ContentOptions is a set of options for calling models. Not all models support all options.

type ContentPart

type ContentPart interface {
	// contains filtered or unexported methods
}

ContentPart is an interface all parts of content have to implement.

type ContentResponse

type ContentResponse struct {
	Choices []*ContentChoice
}

ContentResponse is the response returned by a GenerateContent call. It can potentially return multiple content choices.

func Content

func Content(ctx context.Context, provider Provider, prompt string, options ...ContentOption) (*ContentResponse, error)

Content is a convenience function for calling an LLM provider with a single string prompt.

type Document

type Document struct {
	PageContent string
	Metadata    map[string]any
	Score       float32
}

Document structure used in LLM applications.

func CreateDocuments

func CreateDocuments(textSplitter TextSplitter, texts []string, metadatas []map[string]any) ([]Document, error)

CreateDocuments creates documents from texts and metadatas with a text splitter. If the length of the metadatas is zero, the result documents will contain no metadata. Otherwise, the numbers of texts and metadatas must match.

func SplitDocuments

func SplitDocuments(textSplitter TextSplitter, documents []Document) ([]Document, error)

SplitDocuments splits documents using a text splitter.

type Embedder

type Embedder interface {
	// EmbedDocuments returns a vector for each text.
	EmbedDocuments(ctx context.Context, texts []string) ([][]float32, error)
	// EmbedQuery embeds a single text.
	EmbedQuery(ctx context.Context, text string) ([]float32, error)
}

Embedder is the interface for creating vector embeddings from texts.

func NewEmbedder added in v0.0.2

func NewEmbedder(client EmbedderClient, opts ...EmbedderOption) (Embedder, error)

NewEmbedder creates a new Embedder from the given EmbedderClient, with some options that affect how embedding will be done.

type EmbedderClient

type EmbedderClient interface {
	CreateEmbedding(ctx context.Context, texts []string) ([][]float32, error)
}

EmbedderClient is the interface LLM clients implement for embeddings.

type EmbedderClientFunc

type EmbedderClientFunc func(ctx context.Context, texts []string) ([][]float32, error)

EmbedderClientFunc is an adapter to allow the use of ordinary functions as Embedder Clients. If `f` is a function with the appropriate signature, `EmbedderClientFunc(f)` is an `EmbedderClient` that calls `f`.

func (EmbedderClientFunc) CreateEmbedding

func (e EmbedderClientFunc) CreateEmbedding(ctx context.Context, texts []string) ([][]float32, error)
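
For example, a hedged sketch of wiring an arbitrary embedding function into NewEmbedder (the import path and the zero-vector logic are placeholders):

package examples

import (
	"context"

	llm "example.com/llm" // placeholder import path
)

// newToyEmbedder adapts a plain function into an Embedder. The embedding
// logic here is a stand-in (a fixed-size zero vector per text).
func newToyEmbedder() (llm.Embedder, error) {
	client := llm.EmbedderClientFunc(func(ctx context.Context, texts []string) ([][]float32, error) {
		vecs := make([][]float32, len(texts))
		for i := range texts {
			vecs[i] = make([]float32, 8)
		}
		return vecs, nil
	})
	return llm.NewEmbedder(client, llm.WithBatchSize(64), llm.WithStripNewLines(true))
}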

type EmbedderOption added in v0.0.2

type EmbedderOption func(p *embedder)

func WithBatchSize added in v0.0.2

func WithBatchSize(batchSize int) EmbedderOption

WithBatchSize is an option for specifying the batch size.

func WithStripNewLines added in v0.0.2

func WithStripNewLines(stripNewLines bool) EmbedderOption

WithStripNewLines is an option for specifying whether new lines should be stripped from the texts.

type EmptyHooks

type EmptyHooks struct{}

EmptyHooks are hooks that do nothing. Useful for embedding.

func (EmptyHooks) AgentAction

func (EmptyHooks) AgentAction(context.Context, AgentAction)

func (EmptyHooks) AgentFinish

func (EmptyHooks) AgentFinish(context.Context, AgentFinish)

func (EmptyHooks) ChainEnd

func (EmptyHooks) ChainEnd(context.Context, map[string]any)

func (EmptyHooks) ChainError

func (EmptyHooks) ChainError(context.Context, error)

func (EmptyHooks) ChainStart

func (EmptyHooks) ChainStart(context.Context, map[string]any)

func (EmptyHooks) ProviderError

func (EmptyHooks) ProviderError(context.Context, error)

func (EmptyHooks) ProviderGenerateContentEnd

func (EmptyHooks) ProviderGenerateContentEnd(context.Context, *ContentResponse)

func (EmptyHooks) ProviderGenerateContentStart

func (EmptyHooks) ProviderGenerateContentStart(context.Context, []Message)

func (EmptyHooks) ProviderStart

func (EmptyHooks) ProviderStart(context.Context, []string)

func (EmptyHooks) RetrieverEnd

func (EmptyHooks) RetrieverEnd(context.Context, string, []Document)

func (EmptyHooks) RetrieverStart

func (EmptyHooks) RetrieverStart(context.Context, string)

func (EmptyHooks) StreamingFunc

func (EmptyHooks) StreamingFunc(context.Context, []byte)

func (EmptyHooks) Text

func (EmptyHooks) ToolEnd

func (EmptyHooks) ToolEnd(context.Context, string)

func (EmptyHooks) ToolError

func (EmptyHooks) ToolError(context.Context, error)

func (EmptyHooks) ToolStart

func (EmptyHooks) ToolStart(context.Context, string)

type EmptyMemory

type EmptyMemory struct{}

EmptyMemory is a memory implementation that does nothing. Useful for embedding.

func (EmptyMemory) Clear

func (EmptyMemory) Clear(context.Context) error

func (EmptyMemory) LoadVariables

func (EmptyMemory) LoadVariables(context.Context, map[string]any) (map[string]any, error)

func (EmptyMemory) MemoryKey

func (EmptyMemory) MemoryKey(context.Context) string

func (EmptyMemory) SaveContext

func (EmptyMemory) SaveContext(context.Context, map[string]any, map[string]any) error

func (EmptyMemory) Variables

func (EmptyMemory) Variables(context.Context) []string

type EmptyParser

type EmptyParser struct{}

EmptyParser is a parser that does nothing. Useful for embedding.

func (EmptyParser) FormatInstructions

func (EmptyParser) FormatInstructions() string

func (EmptyParser) Parse

func (EmptyParser) Parse(text string) (any, error)

func (EmptyParser) ParseWithPrompt

func (EmptyParser) ParseWithPrompt(text string, _ Prompt) (any, error)

func (EmptyParser) Type

func (EmptyParser) Type() string

type Env

type Env interface {
	Bool(key string, fallback bool) bool
	Duration(key string, fallback time.Duration) time.Duration
	Float64(key string, fallback float64) float64
	Int(key string, fallback int) int
	String(key, fallback string) string
	Get(key string) string
}

Env is the interface for typed access to environment variables.

func NewEnv

func NewEnv(getenv Getenv) Env

NewEnv creates an Env backed by the provided Getenv function (such as os.Getenv).
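
A hedged sketch; os.Getenv satisfies the Getenv type, and the variable names and fallbacks below are illustrative:

package examples

import (
	"os"
	"time"

	llm "example.com/llm" // placeholder import path
)

// loadConfig reads typed settings with fallbacks from the process environment.
func loadConfig() (model string, timeout time.Duration, debug bool) {
	env := llm.NewEnv(os.Getenv)
	model = env.String("LLM_MODEL", "default-model")
	timeout = env.Duration("LLM_TIMEOUT", 30*time.Second)
	debug = env.Bool("LLM_DEBUG", false)
	return model, timeout, debug
}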

type FunctionCall

type FunctionCall struct {
	// The name of the function to call.
	Name string `json:"name"`
	// The arguments to pass to the function, as a JSON string.
	Arguments string `json:"arguments"`
}

FunctionCall is the name and arguments of a function call.

type FunctionCallBehavior

type FunctionCallBehavior string

FunctionCallBehavior is the behavior to use when calling functions.

const (
	// FunctionCallBehaviorNone will not call any functions.
	FunctionCallBehaviorNone FunctionCallBehavior = "none"
	// FunctionCallBehaviorAuto will call functions automatically.
	FunctionCallBehaviorAuto FunctionCallBehavior = "auto"
)

type FunctionDefinition

type FunctionDefinition struct {
	// Name is the name of the function.
	Name string `json:"name"`
	// Description is a description of the function.
	Description string `json:"description"`
	// Parameters is a list of parameters for the function.
	Parameters any `json:"parameters,omitempty"`
	// Strict is a flag to indicate if the function should be called strictly. Only used for OpenAI LLM structured output.
	Strict bool `json:"strict,omitempty"`
}

FunctionDefinition is a definition of a function that can be called by the model.

type FunctionReference

type FunctionReference struct {
	// Name is the name of the function.
	Name string `json:"name"`
}

FunctionReference is a reference to a function.

type GenericChatMessage

type GenericChatMessage struct {
	Content string
	Role    string
	Name    string
}

GenericChatMessage is a chat message with an arbitrary speaker.

func (GenericChatMessage) MessageContent

func (m GenericChatMessage) MessageContent() string

func (GenericChatMessage) MessageName

func (m GenericChatMessage) MessageName() string

func (GenericChatMessage) Type

func (m GenericChatMessage) Type() ChatMessageType

type Getenv

type Getenv func(string) string

Getenv is a function that looks up environment variables (os.Getenv, for example).

type HTTPDoer

type HTTPDoer interface {
	Do(req *http.Request) (*http.Response, error)
}

HTTPDoer is the interface for an HTTP client that can execute requests.

type Hooks

Hooks is the interface that allows for hooking into specific parts of an LLM application.

type HumanChatMessage

type HumanChatMessage struct {
	Content string
}

HumanChatMessage is a message sent by a human.

func (HumanChatMessage) MessageContent

func (human HumanChatMessage) MessageContent() string

func (HumanChatMessage) Type

func (human HumanChatMessage) Type() ChatMessageType

type ImageURLContent

type ImageURLContent struct {
	URL    string `json:"url"`
	Detail string `json:"detail,omitempty"` // Detail is the detail of the image, e.g. "low", "high".
}

ImageURLContent is content with a URL pointing to an image.

func ImageURLPart

func ImageURLPart(url string) ImageURLContent

ImageURLPart creates a new ImageURLContent from the given URL.

func ImageURLWithDetailPart

func ImageURLWithDetailPart(url string, detail string) ImageURLContent

ImageURLWithDetailPart creates a new ImageURLContent from the given URL and detail.

func (ImageURLContent) String

func (iuc ImageURLContent) String() string

type Loader added in v0.0.5

type Loader interface {
	// Load loads from a source and returns documents.
	Load(ctx context.Context) ([]Document, error)
	// LoadAndSplit loads from a source and splits the documents using a text splitter.
	LoadAndSplit(ctx context.Context, splitter TextSplitter) ([]Document, error)
}

Loader is the interface for loading and splitting documents from a source.

type Memory

type Memory interface {
	// MemoryKey returns the memory key.
	MemoryKey(ctx context.Context) string
	// Variables returns the input keys this memory will load dynamically.
	Variables(ctx context.Context) []string
	// LoadVariables returns key-value pairs given the text input to the chain.
	// If the input is nil, all memories are returned.
	LoadVariables(ctx context.Context, inputs map[string]any) (map[string]any, error)
	// SaveContext saves the context of this model run to memory.
	SaveContext(ctx context.Context, inputs map[string]any, outputs map[string]any) error
	// Clear clears the memory contents.
	Clear(ctx context.Context) error
}

Memory is the interface for memory in chains.

type Message

type Message struct {
	Role  ChatMessageType
	Parts []ContentPart
}

Message is the content of a message sent to an LLM. It has a role and a sequence of parts. For example, it can represent one message in a chat session sent by the user, in which case Role will be ChatMessageTypeHuman and Parts will be the sequence of items sent in this specific message.

func TextParts added in v0.0.4

func TextParts(role ChatMessageType, parts ...string) Message

TextParts is a helper function to create a Message with a role and a list of text parts.
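
A hedged sketch of building a small conversation with TextParts and sending it to a provider (the import path is a placeholder):

package examples

import (
	"context"

	llm "example.com/llm" // placeholder import path
)

// generate sends a system instruction plus a human question to the provider.
func generate(ctx context.Context, provider llm.Provider) (*llm.ContentResponse, error) {
	msgs := []llm.Message{
		llm.TextParts(llm.ChatMessageTypeSystem, "You are a terse assistant."),
		llm.TextParts(llm.ChatMessageTypeHuman, "Explain goroutines in one sentence."),
	}
	return provider.GenerateContent(ctx, msgs, llm.WithMaxTokens(128))
}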

func TextPartsMessage

func TextPartsMessage(role ChatMessageType, parts ...string) Message

TextPartsMessage is a helper function to create a Message with a role and a list of text parts.

type MessageFormatter

type MessageFormatter interface {
	FormatMessages(values map[string]any) ([]ChatMessage, error)
	InputVariables() []string
}

MessageFormatter is an interface for formatting a map of values into a list of messages.

type Named

type Named interface {
	MessageName() string
}

Named is an interface for objects that have a message name.

type ParseError

type ParseError struct {
	Text   string
	Reason string
}

ParseError is the error type returned by output parsers.

func (ParseError) Error

func (e ParseError) Error() string

type Parser

type Parser[T any] interface {
	// Parse parses the output of an LLM call.
	Parse(text string) (T, error)
	// FormatInstructions returns a string describing the format of the output.
	FormatInstructions() string
	// Type returns the string type key uniquely identifying this class of parser
	Type() string
}

Parser is an interface for parsing the output of an LLM call.

type ParserErrorHandler

type ParserErrorHandler struct {
	// The formatter function can be used to format the parsing error. If nil the error will be given
	// as an observation directly.
	Formatter func(err string) string
}

ParserErrorHandler is the struct used to handle parse errors from the agent in the executor. If an executor has a ParserErrorHandler, parsing errors will be formatted using the formatter function and added as an observation. In the next executor step the agent then has a chance to fix the error.

func NewParserErrorHandler

func NewParserErrorHandler(formatFunc func(string) string) *ParserErrorHandler

NewParserErrorHandler creates a new parser error handler.
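
A hedged sketch of a formatter that turns a parse error into a corrective observation (the import path and wording are illustrative):

package examples

import (
	llm "example.com/llm" // placeholder import path
)

// handler reformats parse errors before they are added as observations,
// giving the agent a chance to correct its output on the next step.
var handler = llm.NewParserErrorHandler(func(errMsg string) string {
	return "Your previous reply could not be parsed: " + errMsg +
		". Please answer again using the required format."
})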

type Prompt

type Prompt interface {
	String() string
	Messages() []ChatMessage
}

Prompt is the interface that all prompt values must implement.

type PromptFormatter

type PromptFormatter interface {
	FormatPrompt(values map[string]any) (Prompt, error)
	InputVariables() []string
}

PromptFormatter is an interface for formatting a map of values into a prompt.

type Provider

type Provider interface {
	// GenerateContent asks the model to generate content from a sequence of
	// messages. It's the most general interface for multi-modal LLMs that support
	// chat-like interactions.
	GenerateContent(ctx context.Context, messages []Message, options ...ContentOption) (*ContentResponse, error)
}

Provider is an interface all LLM providers must implement.

type ProviderHooks

type ProviderHooks interface {
	ProviderStart(ctx context.Context, prompts []string)
	ProviderGenerateContentStart(ctx context.Context, ms []Message)
	ProviderGenerateContentEnd(ctx context.Context, res *ContentResponse)
	ProviderError(ctx context.Context, err error)

	StreamingFunc(ctx context.Context, chunk []byte)
}

ProviderHooks contains the hooks that can be used by a provider.

type Renderer

type Renderer interface {
	RenderTemplate(tmpl string, values map[string]any) (string, error)
}

Renderer is used to convert a template string and a map of values into a final string.

type RendererFunc

type RendererFunc func(string, map[string]any) (string, error)

RendererFunc is a convenience function type that implements the Renderer interface by calling itself.

func (RendererFunc) RenderTemplate

func (render RendererFunc) RenderTemplate(tmpl string, values map[string]any) (string, error)
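
A hedged sketch of an ordinary function used as a Renderer via RendererFunc; the "{name}" placeholder convention is purely illustrative:

package examples

import (
	"fmt"
	"strings"

	llm "example.com/llm" // placeholder import path
)

// curlyRenderer is a toy Renderer that replaces "{name}"-style placeholders
// with the corresponding values.
var curlyRenderer llm.Renderer = llm.RendererFunc(func(tmpl string, values map[string]any) (string, error) {
	out := tmpl
	for k, v := range values {
		out = strings.ReplaceAll(out, "{"+k+"}", fmt.Sprint(v))
	}
	return out, nil
})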

type Retriever

type Retriever interface {
	RelevantDocuments(ctx context.Context, query string) ([]Document, error)
}

Retriever is an interface that defines the behavior of a retriever.

type RetrieverHooks

type RetrieverHooks interface {
	RetrieverStart(ctx context.Context, query string)
	RetrieverEnd(ctx context.Context, query string, documents []Document)
}

RetrieverHooks contains the hooks that can be used by a retriever.

type String

type String string

String is a prompt value that is a string.

func (String) Messages

func (s String) Messages() []ChatMessage

Messages returns a single-element ChatMessage slice.

func (String) String

func (s String) String() string

type StringFormatter

type StringFormatter interface {
	FormatString(values map[string]any) (string, error)
}

StringFormatter is an interface for formatting a map of values into a string.

type SystemChatMessage

type SystemChatMessage struct {
	Content string
}

SystemChatMessage is a chat message representing information that should be instructions to the AI system.

func (SystemChatMessage) MessageContent

func (system SystemChatMessage) MessageContent() string

func (SystemChatMessage) Type

func (system SystemChatMessage) Type() ChatMessageType

type Template

type Template struct {
	// Content of the prompt template.
	Content string

	// Variables is a list of variable names the prompt template expects.
	Variables []string

	// Renderer used to generate strings based on the prompt template.
	Renderer

	// Parser is a function that parses the output of the prompt template.
	Parser Parser[any]

	// PartialVariables represents a map of variable names to values or functions
	// that return values. If the value is a function, it will be called when the
	// prompt template is rendered.
	PartialVariables map[string]any
}

Template contains common fields for all prompt templates.

func FStringTemplate

func FStringTemplate(content string, variables []string) Template

FStringTemplate constructs a Template that renders using internal/fstring.

func GoTemplate

func GoTemplate(content string, variables []string) Template

GoTemplate constructs a Template that renders using text/template

func NewTemplate

func NewTemplate(content string, variables []string, renderer Renderer, options ...TemplateOption) Template

func (Template) FormatPrompt

func (p Template) FormatPrompt(values map[string]any) (Prompt, error)

FormatPrompt formats the prompt template and returns a string prompt value.

func (Template) FormatString

func (p Template) FormatString(values map[string]any) (string, error)

FormatString formats the prompt template and returns the result as a string.
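
A hedged sketch of formatting a template directly, outside a chain; the {{.name}} syntax assumes the text/template-backed GoTemplate renderer:

package examples

import (
	llm "example.com/llm" // placeholder import path
)

// renderGreeting fills a two-variable prompt template into a plain string.
func renderGreeting() (string, error) {
	tmpl := llm.GoTemplate("Hello {{.name}}, welcome to {{.place}}!", []string{"name", "place"})
	return tmpl.FormatString(map[string]any{
		"name":  "Gopher",
		"place": "the playground",
	})
}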

func (Template) InputVariables

func (p Template) InputVariables() []string

InputVariables returns the input variables the prompt expects.

type TemplateOption

type TemplateOption func(*Template)

func TemplateWithParser

func TemplateWithParser(parser Parser[any]) TemplateOption

func TemplateWithPartialVariables

func TemplateWithPartialVariables(partialVariables map[string]any) TemplateOption

type TextContent

type TextContent struct {
	Text string
}

TextContent is content with some text.

func TextPart

func TextPart(s string) TextContent

TextPart creates TextContent from a given string.

func (TextContent) String

func (tc TextContent) String() string

type TextSplitter

type TextSplitter interface {
	SplitText(text string) ([]string, error)
}

TextSplitter is the standard interface for splitting texts.
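
A hedged sketch of a trivial implementation that splits on blank lines; such a splitter can be handed to CreateDocuments or SplitDocuments:

package examples

import (
	"strings"

	llm "example.com/llm" // placeholder import path
)

// paragraphSplitter is a toy TextSplitter that splits text on blank lines.
type paragraphSplitter struct{}

func (paragraphSplitter) SplitText(text string) ([]string, error) {
	parts := strings.Split(text, "\n\n")
	out := make([]string, 0, len(parts))
	for _, p := range parts {
		if s := strings.TrimSpace(p); s != "" {
			out = append(out, s)
		}
	}
	return out, nil
}

// Compile-time check that paragraphSplitter satisfies the interface.
var _ llm.TextSplitter = paragraphSplitter{}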

type Tool

type Tool struct {
	// Type is the type of the tool.
	Type string `json:"type"`
	// Function is the function to call.
	Function *FunctionDefinition `json:"function,omitempty"`
}

Tool is a tool that can be used by the model.

type ToolCall

type ToolCall struct {
	// ID is the unique identifier of the tool call.
	ID string `json:"id"`
	// Type is the type of the tool call. Typically, this would be "function".
	Type string `json:"type"`
	// FunctionCall is the function call to be executed.
	FunctionCall *FunctionCall `json:"function,omitempty"`
}

ToolCall is a call to a tool (as requested by the model) that should be executed.

type ToolCallResponse

type ToolCallResponse struct {
	// ToolCallID is the ID of the tool call this response is for.
	ToolCallID string `json:"tool_call_id"`
	// Name is the name of the tool that was called.
	Name string `json:"name"`
	// Content is the textual content of the response.
	Content string `json:"content"`
}

ToolCallResponse is the response returned by a tool call.

type ToolChatMessage

type ToolChatMessage struct {
	// CallID is the ID of the tool call.
	CallID string `json:"tool_call_id"`
	// Content is the content of the tool message.
	Content string `json:"content"`
}

ToolChatMessage is a chat message representing the result of a tool call.

func (ToolChatMessage) ID

func (tool ToolChatMessage) ID() string

func (ToolChatMessage) MessageContent

func (tool ToolChatMessage) MessageContent() string

func (ToolChatMessage) Type

func (tool ToolChatMessage) Type() ChatMessageType

type ToolChoice

type ToolChoice struct {
	// Type is the type of the tool.
	Type string `json:"type"`
	// Function is the function to call (if the tool is a function).
	Function *FunctionReference `json:"function,omitempty"`
}

ToolChoice is a specific tool to use.

type ToolHooks

type ToolHooks interface {
	ToolStart(ctx context.Context, input string)
	ToolEnd(ctx context.Context, output string)
	ToolError(ctx context.Context, err error)
}

ToolHooks contains the hooks that can be used by a tool.

type VectorStore

type VectorStore interface {
	AddDocuments(ctx context.Context, docs []Document, options ...VectorStoreOption) ([]string, error)
	SimilaritySearch(ctx context.Context, query string, numDocs int, options ...VectorStoreOption) ([]Document, error)
}

VectorStore is the interface for saving and querying documents in the form of vector embeddings.

type VectorStoreOption

type VectorStoreOption func(*VectorStoreOptions)

VectorStoreOption is a function that configures a VectorStoreOptions.

func VectorStoreWithDeduplicater

func VectorStoreWithDeduplicater(fn func(ctx context.Context, doc Document) bool) VectorStoreOption

VectorStoreWithDeduplicater returns a VectorStoreOption for setting the deduplicater that could be used when adding documents. This is useful to prevent wasting time on creating an embedding when one already exists.

func VectorStoreWithEmbedder

func VectorStoreWithEmbedder(embedder Embedder) VectorStoreOption

VectorStoreWithEmbedder returns a VectorStoreOption for setting the embedder to use when adding documents or doing similarity search (instead of the embedder from the store context). This is useful when using multiple LLMs with a single vector store.

func VectorStoreWithFilters

func VectorStoreWithFilters(filters any) VectorStoreOption

VectorStoreWithFilters returns a VectorStoreOption that limits searches based on metadata filters. Searches with metadata filters retrieve exactly the number of nearest-neighbor results that match the filters. In most cases the search latency will be lower than for unfiltered searches. See https://docs.pinecone.io/docs/metadata-filtering

func VectorStoreWithNameSpace

func VectorStoreWithNameSpace(namespace string) VectorStoreOption

VectorStoreWithNameSpace returns a VectorStoreOption for setting the name space.

func VectorStoreWithScoreThreshold

func VectorStoreWithScoreThreshold(scoreThreshold float32) VectorStoreOption

type VectorStoreOptions

type VectorStoreOptions struct {
	Namespace      string
	ScoreThreshold float32
	Filters        any
	Embedder       Embedder
	Deduplicater   func(context.Context, Document) bool
}

VectorStoreOptions is a set of options for similarity search and add documents.

type VectorStoreRetriever

type VectorStoreRetriever struct {
	Hooks RetrieverHooks
	// contains filtered or unexported fields
}

VectorStoreRetriever is a retriever for vector stores.

func NewVectorStoreRetriever

func NewVectorStoreRetriever(vs VectorStore, numDocs int, options ...VectorStoreRetrieverOption) VectorStoreRetriever

NewVectorStoreRetriever takes a vector store and returns a retriever using the vector store to retrieve documents.

func (VectorStoreRetriever) RelevantDocuments

func (r VectorStoreRetriever) RelevantDocuments(ctx context.Context, query string) ([]Document, error)

RelevantDocuments returns documents using the vector store.
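
A hedged sketch of wrapping a vector store in a retriever; the document count and score threshold are illustrative:

package examples

import (
	"context"

	llm "example.com/llm" // placeholder import path
)

// topDocs returns up to four documents per query, filtered by a minimum
// similarity score.
func topDocs(ctx context.Context, store llm.VectorStore, query string) ([]llm.Document, error) {
	retriever := llm.NewVectorStoreRetriever(store, 4,
		llm.VectorStoreRetrieverWithOptions(llm.VectorStoreWithScoreThreshold(0.7)),
	)
	return retriever.RelevantDocuments(ctx, query)
}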

type VectorStoreRetrieverOption

type VectorStoreRetrieverOption func(*VectorStoreRetriever)

VectorStoreRetrieverOption is a function that configures a VectorStoreRetriever.

func VectorStoreRetrieverWithHooks

func VectorStoreRetrieverWithHooks(hooks RetrieverHooks) VectorStoreRetrieverOption

func VectorStoreRetrieverWithOptions

func VectorStoreRetrieverWithOptions(options ...VectorStoreOption) VectorStoreRetrieverOption

Directories

Path Synopsis
chains
hooks
internal
loaders
memory
parsers
prompts
providers
splitters
vectorstores
qdrantstore
Package qdrantstore contains an implementation of the llm.VectorStore interface using Qdrant.
