llmkit

package module v0.2.14
Published: Oct 18, 2025 · License: MIT · Imports: 10 · Imported by: 0

README
LLM Kit

A minimal Go library for calling LLM APIs using only the standard library; no external dependencies required.

Providers

  • Anthropic Claude - Chat completions and structured output
  • OpenAI GPT - Chat completions and structured output
  • Google Gemini - Chat completions and structured output
  • xAI Grok - Chat completions and structured output

Structure

├── cmd/                    # Command-line interfaces
│   ├── llmkit-anthropic/   # Anthropic CLI
│   ├── llmkit-openai/      # OpenAI CLI
│   ├── llmkit-google/      # Google CLI
│   └── llmkit-grok/        # Grok CLI
├── anthropic/              # Anthropic (Claude) API package
│   ├── prompt.go           # API implementation
│   └── README.md           # Usage examples
├── openai/                 # OpenAI API package
│   ├── prompt.go           # API implementation
│   └── README.md           # Usage examples
├── google/                 # Google (Gemini) API package
│   ├── prompt.go           # API implementation
│   └── README.md           # Usage examples
├── grok/                   # xAI (Grok) API package
│   ├── prompt.go           # API implementation
│   └── README.md           # Usage examples
├── docs/                   # API documentation
├── examples/               # Example JSON schemas
└── errors.go               # Structured error types

Installation

Install using Homebrew:

brew install aktagon/llmkit/llmkit

This installs the llmkit binary and all provider-specific CLI tools (llmkit-anthropic, llmkit-openai, llmkit-google, llmkit-grok).

Install CLI Tools

Install the command-line tools globally:

# Install Anthropic CLI
go install github.com/aktagon/llmkit/cmd/llmkit-anthropic@latest

# Install OpenAI CLI
go install github.com/aktagon/llmkit/cmd/llmkit-openai@latest

# Install Google CLI
go install github.com/aktagon/llmkit/cmd/llmkit-google@latest

# Install Grok CLI
go install github.com/aktagon/llmkit/cmd/llmkit-grok@latest

Make sure your $GOPATH/bin is in your $PATH to use the installed binaries:

export PATH=$PATH:$(go env GOPATH)/bin

Check installation location:

# See where Go installs binaries
echo $(go env GOPATH)/bin

# List installed llmkit tools
ls -la $(go env GOPATH)/bin/llmkit-*

Use as Library

Add to your Go project:

go get github.com/aktagon/llmkit

Quick Start

Anthropic

Using installed CLI:

export ANTHROPIC_API_KEY="your-key"
llmkit-anthropic "You are helpful" "Hello Claude"

Using go run:

export ANTHROPIC_API_KEY="your-key"
go run cmd/llmkit-anthropic/main.go "You are helpful" "Hello Claude"

Structured output:

llmkit-anthropic \
  "You are an expert at structured data extraction." \
  "What's the weather like in San Francisco? I prefer Celsius." \
  "$(cat examples/openai/schemas/weather-schema.json)"

OpenAI

Using installed CLI:

export OPENAI_API_KEY="your-key"
llmkit-openai "You are helpful" "Hello GPT"

Structured output:

llmkit-openai \
  "You are an expert at structured data extraction." \
  "What's the weather like in San Francisco? I prefer Celsius." \
  "$(cat examples/openai/schemas/weather-schema.json)"

Google

Using installed CLI:

export GOOGLE_API_KEY="your-key"
llmkit-google "You are helpful" "Hello Gemini"

Structured output:

llmkit-google \
  "You are an expert at structured data extraction." \
  "What's the weather like in San Francisco? I prefer Celsius." \
  "$(cat examples/google/schemas/weather-schema.json)"

Grok

Using installed CLI:

export XAI_API_KEY="your-key"
llmkit-grok "You are helpful" "Hello Grok"

Structured output:

llmkit-grok \
  "You are an expert at structured data extraction." \
  "What's the weather like in San Francisco? I prefer Celsius." \
  "$(cat examples/grok/schemas/weather-schema.json)"

API

Simple Prompting

package main

import (
    "fmt"
    "log"
    "os"

    "github.com/aktagon/llmkit"
)

func main() {
    // Works with any provider
    response, err := llmkit.Prompt(llmkit.PromptOptions{
        Provider:     llmkit.ProviderOpenAI,
        SystemPrompt: "You are a helpful assistant",
        UserPrompt:   "What is the capital of France?",
        APIKey:       os.Getenv("OPENAI_API_KEY"),
    })
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("Response:", response)
}

Conversational Agents

For multi-turn conversations with memory and tools:

package main

import (
    "fmt"
    "log"
    "os"

    "github.com/aktagon/llmkit"
)

func main() {
    // Create conversational agent
    agent, err := llmkit.Agent(llmkit.ProviderOpenAI, os.Getenv("OPENAI_API_KEY"))
    if err != nil {
        log.Fatal(err)
    }

    // Chat with memory
    response, err := agent.Chat("Hello! My name is John.")
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("Agent:", response)

    // Continue conversation
    response, err = agent.Chat("What's my name?")
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("Agent:", response)
}

Structured Output

package main

import (
    "fmt"
    "log"
    "os"

    "github.com/aktagon/llmkit"
)

func main() {
    schema := `{
        "name": "weather_info",
        "description": "Weather information extraction",
        "strict": true,
        "schema": {
            "type": "object",
            "properties": {
                "location": {"type": "string"},
                "temperature": {"type": "number"},
                "unit": {"type": "string", "enum": ["C", "F"]}
            },
            "required": ["location", "temperature", "unit"],
            "additionalProperties": false
        }
    }`

    response, err := llmkit.Prompt(llmkit.PromptOptions{
        Provider:     llmkit.ProviderOpenAI,
        SystemPrompt: "You are a weather assistant.",
        UserPrompt:   "What's the weather in Tokyo? Use Celsius.",
        JSONSchema:   schema,
        APIKey:       os.Getenv("OPENAI_API_KEY"),
    })
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println(response)
}

Testing

Mock LLM responses for testing:

func TestMyFunction(t *testing.T) {
    mock := llmkit.NewMockClient()
    mock.PromptFunc = func(ctx context.Context, opts llmkit.PromptOptions) (string, error) {
        return "mocked response", nil
    }
    
    result, err := myFunction(mock)
    // ... test assertions
}

Client interface for dependency injection:

func myFunction(client llmkit.Client) (string, error) {
    return client.Prompt(ctx, opts)
}

// Production
client := llmkit.NewClient("api-key")

// Testing  
client := llmkit.NewMockClient()

For conversational agents, use concrete types directly:

import "github.com/aktagon/llmkit/openai/agents"

agent, err := agents.New("api-key")
response, err := agent.Chat("Hello")

Features

  • Standard chat completions
  • Structured output with JSON schema support and validation
  • Pure Go standard library implementation
  • Command-line interface
  • Structured error types for better error handling
  • Programmatic API for library usage
  • HTTP request/response logging for debugging and monitoring

Error Handling

The library provides structured error types:

  • APIError - Errors from LLM APIs
  • ValidationError - Input validation errors
  • RequestError - Request building/sending errors
  • SchemaError - JSON schema validation errors

response, err := openai.Prompt(systemPrompt, userPrompt, schema, apiKey)
if err != nil {
    switch e := err.(type) {
    case *errors.APIError:
        fmt.Printf("API error: %s (status %d)\n", e.Message, e.StatusCode)
    case *errors.SchemaError:
        fmt.Printf("Schema validation error: %s\n", e.Message)
    case *errors.ValidationError:
        fmt.Printf("Input validation error: %s\n", e.Message)
    default:
        fmt.Printf("Unknown error: %v\n", err)
    }
    return
}

HTTP Logging

Enable request/response logging for debugging:

Environment variables:

export LLMKIT_LOG_HTTP=true
export LLMKIT_LOG_LEVEL=info    # or debug for request/response bodies

Configuration file (llmkit.yaml):

logging:
  http: true
  level: info

Example output:

2025/08/13 17:05:53 INFO HTTP request provider=anthropic method=POST url=https://api.anthropic.com/v1/messages duration=10.233s status=200

Each provider directory contains detailed examples and usage instructions.

Support

Commercial support is available. Contact christian@aktagon.com.

License

MIT


Interested in AI-powered workflow automation for your company? Get started: https://aktagon.com | contact@aktagon.com

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func Prompt added in v0.2.1

func Prompt(opts PromptOptions) (string, error)

Prompt sends a prompt request to the specified LLM provider

func PromptAnthropic added in v0.2.1

func PromptAnthropic(systemPrompt, userPrompt, jsonSchema, apiKey string) (string, error)

PromptAnthropic is a convenience function for Anthropic prompts

func PromptGoogle added in v0.2.1

func PromptGoogle(systemPrompt, userPrompt, jsonSchema, apiKey string) (string, error)

PromptGoogle is a convenience function for Google prompts

func PromptGrok added in v0.2.14

func PromptGrok(systemPrompt, userPrompt, jsonSchema, apiKey string) (string, error)

PromptGrok is a convenience function for Grok prompts

func PromptOpenAI added in v0.2.1

func PromptOpenAI(systemPrompt, userPrompt, jsonSchema, apiKey string) (string, error)

PromptOpenAI is a convenience function for OpenAI prompts

Types

type Client added in v0.2.11

type Client interface {
	Prompt(ctx context.Context, opts PromptOptions) (string, error)
}

Client interface enables mocking and testing

func NewClient added in v0.2.11

func NewClient(apiKey string) Client

NewClient creates a new client with default HTTP client

func NewClientWithHTTPClient added in v0.2.11

func NewClientWithHTTPClient(apiKey string, httpClient *http.Client) Client

NewClientWithHTTPClient creates a new client with custom HTTP client

type ConversationalAgent added in v0.2.5

type ConversationalAgent interface {
	Chat(message string, opts ...interface{}) (interface{}, error)
	RegisterTool(tool interface{}) error
	Reset(clearMemory bool) error
	Remember(key, value string) error
	Recall(key string) (string, bool)
}

ConversationalAgent represents a conversational agent

func Agent added in v0.2.5

func Agent(provider Provider, apiKey string) (ConversationalAgent, error)

Agent creates a new conversational agent for the specified provider

type MockClient added in v0.2.11

type MockClient struct {
	PromptFunc func(ctx context.Context, opts PromptOptions) (string, error)
}

MockClient implements Client interface for testing

func NewMockClient added in v0.2.11

func NewMockClient() *MockClient

NewMockClient creates a new mock client with default behavior

func (*MockClient) Prompt added in v0.2.11

func (m *MockClient) Prompt(ctx context.Context, opts PromptOptions) (string, error)

Prompt delegates to the mock function

type PromptOptions added in v0.2.1

type PromptOptions struct {
	Provider     Provider        // Which LLM provider to use
	SystemPrompt string          // System prompt for the request
	UserPrompt   string          // User prompt for the request
	JSONSchema   string          // Optional JSON schema for structured output
	APIKey       string          // API key for the provider
	MaxTokens    int             // Maximum tokens in response (0 = omit from request)
	Temperature  float64         // Sampling temperature (0 = omit from request)
	Files        []internal.File // Optional file attachments
}

PromptOptions configures the prompt request

type Provider added in v0.2.1

type Provider string

Provider represents the LLM provider type

const (
	ProviderOpenAI    Provider = "openai"
	ProviderAnthropic Provider = "anthropic"
	ProviderGoogle    Provider = "google"
	ProviderGrok      Provider = "grok"
)

Directories

Path Synopsis
cmd
llmkit command
llmkit-google command
llmkit-grok command
llmkit-openai command
tools command
examples
anthropic/files command
google/files command
google/proofs command
grok/chat command
grok/prompt command
grok/weather command
openai/files command
workflows/simple_workflow command
Example program demonstrating the workflow package
Example program demonstrating the workflow package
Package workflow provides a minimal API for creating and executing workflows consisting of tasks with conditional transitions.
Package workflow provides a minimal API for creating and executing workflows consisting of tasks with conditional transitions.
