bigquery

package
Version: v0.3.0
Published: Jun 14, 2025 License: MIT Imports: 20 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func NewBigQueryDestination

func NewBigQueryDestination(name string, config *config.BaseConfig) (core.Destination, error)

NewBigQueryDestination creates a new BigQuery destination connector
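
A minimal sketch of constructing and initializing the connector, then shutting it down. The BaseConfig fields and the type assertion to *BigQueryDestination are assumptions; the core.Destination value returned by the constructor may already expose everything a caller needs, and import paths for the config package are omitted because they depend on the module path.

func setupDestination(ctx context.Context) (*BigQueryDestination, error) {
	// Assumed: BaseConfig carries the BigQuery project, dataset, table,
	// and credential settings; consult the config package for the
	// actual fields.
	cfg := &config.BaseConfig{}

	dst, err := NewBigQueryDestination("events", cfg)
	if err != nil {
		return nil, err
	}

	// Assumed: the concrete type is wanted for connector-specific calls
	// such as GetStats.
	bq, ok := dst.(*BigQueryDestination)
	if !ok {
		return nil, fmt.Errorf("unexpected destination type %T", dst)
	}

	if err := bq.Initialize(ctx, cfg); err != nil {
		_ = bq.Close(ctx)
		return nil, err
	}
	return bq, nil
}

Callers are expected to Close the returned connector once the load finishes.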

Types

type BigQueryDestination

type BigQueryDestination struct {
	*base.BaseConnector
	// contains filtered or unexported fields
}

BigQueryDestination is a high-performance BigQuery destination connector

func (*BigQueryDestination) AlterSchema

func (b *BigQueryDestination) AlterSchema(ctx context.Context, oldSchema, newSchema *core.Schema) error

AlterSchema alters the schema for the destination

func (*BigQueryDestination) BeginTransaction

func (b *BigQueryDestination) BeginTransaction(ctx context.Context) (core.Transaction, error)

BeginTransaction begins a new transaction (not supported by BigQuery)

func (*BigQueryDestination) BulkLoad

func (b *BigQueryDestination) BulkLoad(ctx context.Context, reader interface{}, format string) error

BulkLoad loads data in bulk (not implemented yet)

func (*BigQueryDestination) Close

func (b *BigQueryDestination) Close(ctx context.Context) error

Close closes the BigQuery destination connector

func (*BigQueryDestination) CreateSchema

func (b *BigQueryDestination) CreateSchema(ctx context.Context, schema *core.Schema) error

CreateSchema creates or updates the schema for the destination

func (*BigQueryDestination) DropSchema

func (b *BigQueryDestination) DropSchema(ctx context.Context, schema *core.Schema) error

DropSchema removes the table

func (*BigQueryDestination) GetStats

func (b *BigQueryDestination) GetStats() *BigQueryStats

GetStats returns real-time statistics
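
A sketch of surfacing the connector's statistics during a long-running load; the structured-logging approach shown here is illustrative.

func logStats(b *BigQueryDestination) {
	stats := b.GetStats()
	// BigQueryStats carries json tags, so it marshals directly into a
	// structured log line.
	if data, err := json.Marshal(stats); err == nil {
		log.Printf("bigquery destination stats: %s", data)
	}
}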

func (*BigQueryDestination) Initialize

func (b *BigQueryDestination) Initialize(ctx context.Context, config *config.BaseConfig) error

Initialize initializes the BigQuery destination connector

func (*BigQueryDestination) SupportsBatch

func (b *BigQueryDestination) SupportsBatch() bool

SupportsBatch returns true since BigQuery supports batch operations

func (*BigQueryDestination) SupportsBulkLoad

func (b *BigQueryDestination) SupportsBulkLoad() bool

SupportsBulkLoad returns true since BigQuery supports bulk loading

func (*BigQueryDestination) SupportsStreaming

func (b *BigQueryDestination) SupportsStreaming() bool

SupportsStreaming returns true since BigQuery supports streaming inserts

func (*BigQueryDestination) SupportsTransactions

func (b *BigQueryDestination) SupportsTransactions() bool

SupportsTransactions returns false since BigQuery doesn't support traditional transactions

func (*BigQueryDestination) SupportsUpsert

func (b *BigQueryDestination) SupportsUpsert() bool

SupportsUpsert returns true since BigQuery supports MERGE operations
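
The Supports* methods let a pipeline pick a write strategy at runtime rather than hard-coding one per connector. A sketch of that branching:

func chooseStrategy(b *BigQueryDestination) string {
	switch {
	case b.SupportsStreaming():
		return "streaming"
	case b.SupportsBatch():
		return "batch"
	case b.SupportsBulkLoad():
		return "bulk-load"
	default:
		return "row-by-row"
	}
}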

func (*BigQueryDestination) Upsert

func (b *BigQueryDestination) Upsert(ctx context.Context, records []*models.Record, keys []string) error

Upsert performs upsert operations using a MERGE statement (not implemented yet)
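
Although Upsert is not implemented yet, the intended call shape follows from the signature: the records to merge plus the column names that uniquely identify a row. A sketch, with the key column name as a placeholder:

func upsertByID(ctx context.Context, b *BigQueryDestination, records []*models.Record) error {
	// "id" is a placeholder; pass the columns that uniquely identify a
	// row in the destination table.
	return b.Upsert(ctx, records, []string{"id"})
}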

func (*BigQueryDestination) Write

func (b *BigQueryDestination) Write(ctx context.Context, stream *core.RecordStream) error

Write writes a stream of records to BigQuery

func (*BigQueryDestination) WriteBatch

func (b *BigQueryDestination) WriteBatch(ctx context.Context, stream *core.BatchStream) error

WriteBatch writes batches of records to BigQuery
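
Write and WriteBatch consume the stream types defined in the core package; constructing those streams is outside this package. A sketch of choosing between the two entry points, assuming the caller already holds both stream forms:

func load(ctx context.Context, b *BigQueryDestination, records *core.RecordStream, batches *core.BatchStream) error {
	// Prefer batched writes when a batch stream is available; fall back
	// to the per-record stream otherwise.
	if b.SupportsBatch() && batches != nil {
		return b.WriteBatch(ctx, batches)
	}
	return b.Write(ctx, records)
}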

type BigQueryStats

type BigQueryStats struct {
	RecordsWritten    int64     `json:"records_written"`
	BytesWritten      int64     `json:"bytes_written"`
	BatchesProcessed  int64     `json:"batches_processed"`
	ErrorsEncountered int64     `json:"errors_encountered"`
	AverageLatency    int64     `json:"average_latency_ms"` // milliseconds
	StreamingEnabled  bool      `json:"streaming_enabled"`
	LastBatchTime     time.Time `json:"last_batch_time"`
}

BigQueryStats provides real-time statistics

type RecordBatch

type RecordBatch struct {
	Records   []*models.Record
	Timestamp time.Time
	BatchID   string
}

RecordBatch represents a batch of records to be inserted
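
A sketch of assembling a RecordBatch from records produced elsewhere in the pipeline; the time-derived batch ID is an assumption, and any unique string works.

func newBatch(records []*models.Record) *RecordBatch {
	now := time.Now().UTC()
	return &RecordBatch{
		Records:   records,
		Timestamp: now,
		// Assumed ID scheme: a nanosecond-precision timestamp.
		BatchID:   now.Format("20060102T150405.000000000"),
	}
}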
