Documentation
¶
Overview ¶
Package goras provides a higher level API to create and train neural networks with gorgonia.
It also includes a few small helpers to make working with tensors easier; similar functionality may already exist in gorgonia, but it was not easy to find.
Index ¶
- Variables
- func GetTensorDataType(t interface{}) (tensor.Dtype, error)
- func Make1DSliceTensor[T any](data []T) (tensor.Tensor, error)
- func Make2DSliceTensor[T any](data [][]T) (tensor.Tensor, error)
- func MustGetTensorDataType(t interface{}) tensor.Dtype
- func MustMake1DSliceTensor[T any](data []T) tensor.Tensor
- func MustMake2DSliceTensor[T any](data [][]T) tensor.Tensor
- func NewNamer(baseName string) func() string
- type ActivationLayer
- func Activation(m *Model, name string, activation string) *ActivationLayer
- func Binary(m *Model, name string) *ActivationLayer
- func LeakyRelu(m *Model, name string, grad ...float64) *ActivationLayer
- func Relu(m *Model, name string) *ActivationLayer
- func Sigmoid(m *Model, name string) *ActivationLayer
- func Softmax(m *Model, name string) *ActivationLayer
- func Tanh(m *Model, name string) *ActivationLayer
- type BuildOpts
- type Conv2DLayer
- type DenseLayer
- type DropoutLayer
- type EpochCallback
- type FitOpt
- type InputLayer
- type Layer
- type LayerBase
- type LossFunc
- type MaxPooling2DLayer
- type Model
- func (m *Model) AddLayer(l Layer)
- func (m *Model) BindParamsFrom(m1 *Model) error
- func (m *Model) Build(opts ...BuildOpts) error
- func (m *Model) CopyParamsFrom(m1 *Model) error
- func (m *Model) Fit(xs, ys map[string]T.Tensor, solver G.Solver, opts ...FitOpt) error
- func (m *Model) FitBatch(inputs, lossRequirements map[string]T.Tensor, solver G.Solver) (float64, error)
- func (m *Model) FitGenerator(tdg TrainingDataGenerator, solver G.Solver, opts ...FitOpt) error
- func (m *Model) GetParams() map[string]*T.Dense
- func (m *Model) MustBindParamsFrom(m1 *Model)
- func (m *Model) MustBuild(opts ...BuildOpts)
- func (m *Model) MustCopyParamsFrom(m1 *Model)
- func (m *Model) MustFit(xs, ys map[string]T.Tensor, solver G.Solver, opts ...FitOpt)
- func (m *Model) MustFitBatch(inputs, lossRequirements map[string]T.Tensor, solver G.Solver) float64
- func (m *Model) MustFitGenerator(tdg TrainingDataGenerator, solver G.Solver, opts ...FitOpt)
- func (m *Model) MustPredict(xs map[string]T.Tensor) map[string]T.Tensor
- func (m *Model) MustPredictBatch(inputs map[string]T.Tensor) map[string]T.Tensor
- func (m *Model) MustReadParams(r io.Reader)
- func (m *Model) MustSetParams(params map[string]*T.Dense)
- func (m *Model) MustWriteParams(w io.Writer)
- func (m *Model) Predict(xs map[string]T.Tensor) (map[string]T.Tensor, error)
- func (m *Model) PredictBatch(inputs map[string]T.Tensor) (map[string]T.Tensor, error)
- func (m *Model) ReadParams(r io.Reader) error
- func (m *Model) SetParams(params map[string]*T.Dense) error
- func (m *Model) Summary() string
- func (m *Model) Trainables() G.Nodes
- func (m *Model) WriteParams(w io.Writer) error
- type NamedTs
- type OneHotLayer
- type ReshapeLayer
- type TensorTrainingDataGenerator
- type TrainingDataGenerator
Constants ¶
This section is empty.
Variables ¶
var ImageUtils imageUtils = imageUtils{}
ImageUtils is a struct that contains functions that are not core to goras, but are useful for image manipulation.
Functions ¶
func GetTensorDataType ¶ added in v0.2.0
func Make1DSliceTensor ¶ added in v0.3.0
Make1DSliceTensor converts a 1D slice to a tensor.
func Make2DSliceTensor ¶ added in v0.3.0
Make2DSliceTensor converts a 2D slice to a tensor. The slice is indexed as [row][column].
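A minimal usage sketch of the two helpers above (assuming the module is imported from github.com/JoshPattman/goras; the data values are illustrative only):

package main

import (
	"fmt"

	"github.com/JoshPattman/goras"
)

func main() {
	// A 1D tensor from a plain slice (shape (4,)).
	xs, err := goras.Make1DSliceTensor([]float64{1, 2, 3, 4})
	if err != nil {
		panic(err)
	}

	// A 2D tensor from a slice indexed as [row][column] (shape (2, 3)).
	grid := goras.MustMake2DSliceTensor([][]float64{
		{1, 2, 3},
		{4, 5, 6},
	})

	fmt.Println(xs.Shape(), grid.Shape())
}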
func MustGetTensorDataType ¶ added in v0.3.3
MustGetTensorDataType calls GetTensorDataType and panics if there is an error.
func MustMake1DSliceTensor ¶ added in v0.3.3
MustMake1DSliceTensor calls Make1DSliceTensor and panics if there is an error.
func MustMake2DSliceTensor ¶ added in v0.3.3
MustMake2DSliceTensor calls Make2DSliceTensor and panics if there is an error.
Types ¶
type ActivationLayer ¶
ActivationLayer is a layer that applies an activation function to its input.
- Input/Output Shape: any shape
func Activation ¶
func Activation(m *Model, name string, activation string) *ActivationLayer
Activation creates a new ActivationLayer on the Model with the given activation function. The activation function can be one of ["sigmoid", "relu", "tanh", "binary", "softmax", "leakyrelu"].
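A hedged sketch of typical usage, assuming m is an existing *goras.Model and prev is the *G.Node output by a previous layer:

// Apply a relu activation to the output of the previous layer.
act := goras.Activation(m, "activation_1", "relu")
out := act.MustAttach(prev)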
func Binary ¶ added in v0.0.4
func Binary(m *Model, name string) *ActivationLayer
Binary creates a new ActivationLayer on the Model with the binary activation function.
func LeakyRelu ¶ added in v0.0.4
func LeakyRelu(m *Model, name string, grad ...float64) *ActivationLayer
LeakyRelu creates a new ActivationLayer on the Model with the leaky relu activation function. You can optionally specify the negative gradient (LeakyRelu(model, name, grad)). If you don't, it defaults to 0.01.
func Relu ¶ added in v0.0.4
func Relu(m *Model, name string) *ActivationLayer
Relu creates a new ActivationLayer on the Model with the relu activation function.
func Sigmoid ¶ added in v0.0.4
func Sigmoid(m *Model, name string) *ActivationLayer
Sigmoid creates a new ActivationLayer on the Model with the sigmoid activation function.
func Softmax ¶ added in v0.0.4
func Softmax(m *Model, name string) *ActivationLayer
Softmax creates a new ActivationLayer on the Model with the softmax activation function.
func Tanh ¶ added in v0.0.4
func Tanh(m *Model, name string) *ActivationLayer
Tanh creates a new ActivationLayer on the Model with the tanh activation function.
func (*ActivationLayer) MustAttach ¶ added in v0.0.3
func (l *ActivationLayer) MustAttach(n *G.Node) *G.Node
MustAttach attaches this layer to a previous node. It panics on error.
func (*ActivationLayer) Parameters ¶
func (l *ActivationLayer) Parameters() map[string]*G.Node
Parameters returns a map of the parameters of the layer.
type BuildOpts ¶ added in v0.0.4
type BuildOpts func(*buildParams)
BuildOpts are options for the Build method.
func WithInput ¶ added in v0.2.0
WithInput adds an input node to the model.
- inputName: The name we will use to pass tensors to this node. This must be unique, and will be used later in fit and predict methods.
- inputNode: The node to use as the input. This is usually from a goras.Input layer.
func WithOutput ¶ added in v0.2.0
WithOutput adds an output node to the model.
- outputName: The name we will use to get tensors from this node. This must be unique, and will be used later in fit and predict methods.
- outputNode: The node to use as the output.
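The signatures of WithInput and WithOutput are not shown above, so the following Build sketch infers them from the parameter descriptions. It assumes inputLayer is a goras.Input layer and finalNode is the network's last *G.Node; depending on the model, further options (such as a loss) may also be required:

err := m.Build(
	goras.WithInput("x", inputLayer.Node()),
	goras.WithOutput("yp", finalNode),
)
if err != nil {
	panic(err)
}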
type Conv2DLayer ¶ added in v0.0.4
type Conv2DLayer struct {
	LayerBase
	Kernels    *G.Node
	KernelSize []int
	NumKernels int
	Stride     []int
	Padding    string
}
Conv2DLayer is a 2D convolutional layer.
- Input Shape: (batch_size, previous_kernels/previous_channels, img_width, img_height)
- Output Shape: (batch_size, num_kernels, img_width, img_height)
func Conv2D ¶ added in v0.0.4
func Conv2D(m *Model, name string, kernelShape, stride []int, padding string, numKernels int) *Conv2DLayer
Conv2D is a constructor to create a 2D convolutional layer. Options for padding are "same" or "valid".
func SimpleConv2D ¶ added in v0.0.4
func SimpleConv2D(m *Model, name string, kernelSize int, numKernels int) *Conv2DLayer
SimpleConv2D is a constructor to create a 2D convolutional layer. It has a kernel shape of [kernelSize, kernelSize], a stride of [1, 1], and padding of "same". This means the output will have the same width and height as the input.
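A short sketch, assuming m is a *goras.Model and prev is a node of shape (batch_size, channels, img_width, img_height):

// 16 kernels of shape [3, 3], stride [1, 1], "same" padding.
conv := goras.SimpleConv2D(m, "conv_1", 3, 16)
out := conv.MustAttach(prev) // out has shape (batch_size, 16, img_width, img_height)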
func (*Conv2DLayer) MustAttach ¶ added in v0.0.4
func (l *Conv2DLayer) MustAttach(n *G.Node) *G.Node
MustAttach attaches this layer to a previous node. It panics on error.
func (*Conv2DLayer) Parameters ¶ added in v0.0.4
func (l *Conv2DLayer) Parameters() map[string]*G.Node
Parameters returns a map of the parameters of the layer.
type DenseLayer ¶
DenseLayer is a layer that performs a dense (fully connected) operation. It does not perform any activation or dropout.
- Input Shape: (batch_size, num_inputs)
- Output Shape: (batch_size, num_nodes)
func Dense ¶
func Dense(m *Model, name string, nodes int) *DenseLayer
Dense creates a new dense layer on the specified model.
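For illustration (m and prev are assumed to be an existing *goras.Model and the previous layer's output node):

// A fully connected layer with 32 outputs, followed by a relu activation.
hidden := goras.Dense(m, "dense_1", 32).MustAttach(prev)
hidden = goras.Relu(m, "relu_1").MustAttach(hidden)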
func (*DenseLayer) MustAttach ¶ added in v0.0.3
func (l *DenseLayer) MustAttach(n *G.Node) *G.Node
MustAttach attaches the layer to a previous node, panicking on error.
func (*DenseLayer) Parameters ¶
func (l *DenseLayer) Parameters() map[string]*G.Node
Parameters returns a map of the parameters of the layer.
type DropoutLayer ¶
DropoutLayer is a dropout layer.
- Input/Output Shape: any shape
func Dropout ¶
func Dropout(m *Model, name string, dropoutProbability float64) *DropoutLayer
Dropout creates a new DropoutLayer on the Model with the given dropout probability.
func (*DropoutLayer) MustAttach ¶ added in v0.0.3
func (l *DropoutLayer) MustAttach(n *G.Node) *G.Node
MustAttach attaches the DropoutLayer to the given node. It panics on error.
func (*DropoutLayer) Parameters ¶
func (d *DropoutLayer) Parameters() map[string]*G.Node
Parameters returns a map of the parameters of the layer.
type EpochCallback ¶ added in v0.0.5
func RepeatedSaveModelParametersCallback ¶ added in v0.0.5
func RepeatedSaveModelParametersCallback(model *Model, pathWithFormat string, every int) EpochCallback
RepeatedSaveModelParametersCallback saves the model parameters to the given path. It saves the model every `every` epochs, so you get multiple models. The path should contain a %v format specifier, which will be replaced with the epoch number.
func SaveModelParametersCallback ¶ added in v0.0.5
func SaveModelParametersCallback(model *Model, path string) EpochCallback
SaveModelParametersCallback saves the model parameters to the given path. It overwrites the file at the given path each epoch, so you only get the most recent model.
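A sketch combining both callbacks via WithEpochCallback (described below); model, xs, ys and solver are assumed to already exist, and the file names are only examples:

latest := goras.SaveModelParametersCallback(model, "latest.params")
snapshots := goras.RepeatedSaveModelParametersCallback(model, "model_epoch_%v.params", 10)
err := model.Fit(xs, ys, solver,
	goras.WithEpochCallback(latest),
	goras.WithEpochCallback(snapshots),
)
if err != nil {
	panic(err)
}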
type FitOpt ¶ added in v0.0.3
type FitOpt func(*fitParams)
FitOpt is an option for the Fit method.
func WithClearLine ¶ added in v0.0.3
WithClearLine sets whether to clear the line when logging the loss.
func WithEpochCallback ¶ added in v0.0.5
func WithEpochCallback(cb EpochCallback) FitOpt
WithEpochCallback adds a callback to be called at the end of each epoch.
func WithEpochs ¶ added in v0.0.3
WithEpochs sets the number of epochs to train for.
func WithLoggingEvery ¶ added in v0.0.3
WithLoggingEvery sets how often to log the loss.
func WithVerbose ¶ added in v0.0.3
WithVerbose sets whether to log the loss.
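The parameter types of these options are not shown above; the sketch below assumes WithEpochs and WithLoggingEvery take an int and WithVerbose takes a bool (xs, ys and solver are assumed to exist):

model.MustFit(xs, ys, solver,
	goras.WithEpochs(50),
	goras.WithVerbose(true),
	goras.WithLoggingEvery(5),
)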
type InputLayer ¶
type InputLayer struct {
LayerBase
}
InputLayer is a layer that takes an input of a specific shape.
- Input/Output Shape: (batch_size, ...other_dims) [the specified shape]
func Input ¶
Input creates a new input layer on the specified model. To access the resulting *Node, use the .Node() function.
func (*InputLayer) Parameters ¶
func (l *InputLayer) Parameters() map[string]*G.Node
Parameters returns a map of the parameters of the layer.
type Layer ¶
type Layer interface {
	Parameters() map[string]*G.Node // This returns a map of the parameters. E.g. {"weights":[...], "biases":[...]}
	Name() string                   // This returns a name unique to this layer in the model
	Trainable() bool                // This specifies whether the layer is updated during Fit()
	Type() string                   // This is used for Summary()
	Node() *G.Node                  // This returns the node used as the main output for this layer
	INodes() []*G.Node              // This returns all nodes used as inputs to this layer
}
Layer is an interface that all layers must implement in order to be added to a model.
type LayerBase ¶
type LayerBase struct {
	Graph       *G.ExprGraph
	LayerName   string
	LayerType   string
	IsTrainable bool
	OutputNode  *G.Node
	InputNodes  []*G.Node
}
LayerBase is a struct that all layers should embed. It provides some useful shared fields and methods.
func (*LayerBase) Node ¶ added in v0.0.5
Node returns the final node in this layer (the output node)
type LossFunc ¶ added in v0.2.0
LossFunc is a function that when called, returns:
- a node (loss output scalar)
- a map of nodes which the loss requires to be created (for instance, this is usually the target for the output layer)
- an error
func BCELoss ¶ added in v0.2.0
BCELoss creates the nodes to calculate binary crossentropy loss between a predicted and target node. It should be used when calling Model.Build().
func MSELoss ¶ added in v0.2.0
MSELoss creates the nodes to calculate mean squared error loss between a predicted and target node. It should be used when calling Model.Build().
func WeightedAdditiveLoss ¶ added in v0.2.0
KNOWN BUG: I'm pretty certain this will not work if the graph is using float32s, because all the weights are float64
type MaxPooling2DLayer ¶ added in v0.0.4
MaxPooling2DLayer is a max pooling layer.
- Input Shape: (batch_size, num_channels, img_height, img_width)
- Output Shape: (batch_size, num_channels, img_height, img_width) [img_height and img_width will be smaller than the input]
func MaxPooling2D ¶ added in v0.0.4
func MaxPooling2D(m *Model, name string, poolSize, stride []int, padding string) *MaxPooling2DLayer
MaxPooling2D creates a new max pooling layer on the specified model. Padding can be either "same" or "valid".
func SimpleMaxPooling2D ¶ added in v0.0.4
func SimpleMaxPooling2D(m *Model, name string, poolSize int) *MaxPooling2DLayer
SimpleMaxPooling2D creates a new max pooling layer on the specified model. It uses "same" padding and a stride equal to poolSize, with the same pool size in both dimensions.
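For example (m is an assumed *goras.Model, conv the output node of a convolutional layer):

// A 2x2 max pool, roughly halving the spatial dimensions.
pooled := goras.SimpleMaxPooling2D(m, "pool_1", 2).MustAttach(conv)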
func (*MaxPooling2DLayer) Attach ¶ added in v0.0.4
Attach attaches the MaxPooling2DLayer to the given node.
func (*MaxPooling2DLayer) MustAttach ¶ added in v0.0.4
func (l *MaxPooling2DLayer) MustAttach(n *G.Node) *G.Node
MustAttach attaches the MaxPooling2DLayer to the given node. It panics on error.
func (*MaxPooling2DLayer) Parameters ¶ added in v0.0.4
func (l *MaxPooling2DLayer) Parameters() map[string]*G.Node
Parameters returns a map of the parameters of the layer.
type Model ¶
type Model struct {
	Graph             *G.ExprGraph
	Layers            []Layer
	Machine           G.VM
	InputNodes        map[string]*G.Node
	OutputNodes       map[string]*G.Node
	OutputValues      map[string]*G.Value // This is deliberately a ref because i think maps are scary
	LossValue         G.Value
	LossRequiredNodes map[string]*G.Node
}
Model is the core primitive of goras. It is effectively a wrapper around a Gorgonia graph, with extra functionality.
func (*Model) AddLayer ¶
AddLayer adds a layer to the model. You usually don't need to call this directly, as the layer constructors do it for you.
func (*Model) BindParamsFrom ¶
BindParamsFrom binds the parameters of model m1 to the parameters of this model m, meaning layers with the same name will share the same tensors. This is a bit of a hack to allow two models to train the same weights. It can be called multiple times, and later binds may override earlier ones. For example, when building an autoencoder you would have one main model for training, plus an encoder model and a decoder model bound to it; that lets you run parts of the network separately.
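A hedged autoencoder-style sketch, assuming fullModel, encoderModel and decoderModel were built with matching layer names:

// Layers with matching names now share the same underlying tensors, so
// training fullModel also updates the weights seen by the partial models.
encoderModel.MustBindParamsFrom(fullModel)
decoderModel.MustBindParamsFrom(fullModel)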
func (*Model) Build ¶
Build builds the model, using a specified input and output node. It adds the loss function to the graph, and creates the machine. This should only be called once per model.
func (*Model) CopyParamsFrom ¶ added in v0.2.0
CopyParamsFrom copies the parameters of model m1 into this model m, meaning layers with the same name will hold the same values in their tensors. The tensors are copies of each other, so changing one will not change the other. If you want to share the tensors instead, use BindParamsFrom.
func (*Model) FitBatch ¶
func (m *Model) FitBatch(inputs, lossRequirements map[string]T.Tensor, solver G.Solver) (float64, error)
FitBatch runs the model on a batch of input data, and then trains the model on the target data. The solver used is passed in as an argument. IMPORTANT NOTE: Currently, when the data is batched, the last partial batch will be discarded if the batch size does not evenly divide the number of samples.
func (*Model) FitGenerator ¶ added in v0.1.0
FitGenerator fits the model to the given data generator.
func (*Model) GetParams ¶
GetParams returns a map of all the parameters in the model. The keys are the layer name and parameter name, separated by a colon (e.g. "model_1:weights")
func (*Model) MustBindParamsFrom ¶ added in v0.3.3
MustBindParamsFrom calls BindParamsFrom, but panics if there is an error.
func (*Model) MustCopyParamsFrom ¶ added in v0.3.3
MustCopyParamsFrom calls CopyParamsFrom, but panics if there is an error.
func (*Model) MustFitBatch ¶ added in v0.3.3
MustFitBatch calls FitBatch, but panics if there is an error.
func (*Model) MustFitGenerator ¶ added in v0.3.3
func (m *Model) MustFitGenerator(tdg TrainingDataGenerator, solver G.Solver, opts ...FitOpt)
MustFitGenerator calls FitGenerator, but panics if there is an error.
func (*Model) MustPredict ¶ added in v0.3.3
MustPredict calls Predict, but panics if there is an error.
func (*Model) MustPredictBatch ¶ added in v0.3.3
MustPredictBatch calls PredictBatch, but panics if there is an error.
func (*Model) MustReadParams ¶ added in v0.3.3
MustReadParams calls ReadParams, but panics if there is an error.
func (*Model) MustSetParams ¶ added in v0.3.3
MustSetParams calls SetParams, but panics if there is an error.
func (*Model) MustWriteParams ¶ added in v0.3.3
MustWriteParams calls WriteParams, but panics if there is an error.
func (*Model) Predict ¶ added in v0.0.5
Predict returns the models outputs for the given inputs. It cuts the inputs into batches so the inputs can be of any length.
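A sketch, assuming the model was built with an input named "x" and an output named "yp" (as in the Build sketch above), xTensor holds the input data, and T is the gorgonia tensor package as aliased throughout this documentation:

outs, err := m.Predict(map[string]T.Tensor{"x": xTensor})
if err != nil {
	panic(err)
}
yp := outs["yp"] // one entry per output registered with WithOutput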
func (*Model) PredictBatch ¶
PredictBatch runs the model on a batch of input data. The batch size must match the input node shape.
func (*Model) ReadParams ¶
ReadParams reads the parameters in gob format from an io.Reader. The params are retrieved with Model.GetParams.
func (*Model) SetParams ¶
SetParams sets the parameters in the model, which can be retrieved with Model.GetParams. It will only load parameters with matching names, and will ignore any others. This means you can load parameters from a model with a different architecture, as long as the names match on equivalent layers.
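A save/load sketch using WriteParams and ReadParams with the standard os package; the file name and the two models m and m2 are assumptions for illustration:

// Save the trained parameters in gob format.
f, err := os.Create("model.params")
if err != nil {
	panic(err)
}
if err := m.WriteParams(f); err != nil {
	panic(err)
}
f.Close()

// Later, load them into another model with matching layer names.
f, err = os.Open("model.params")
if err != nil {
	panic(err)
}
defer f.Close()
if err := m2.ReadParams(f); err != nil {
	panic(err)
}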
func (*Model) Trainables ¶
Trainables returns a list of all the trainable nodes in the model.
type NamedTs ¶ added in v0.2.0
NamedTs is a map of string to T.Tensor. It is just a convenience type to make code nicer to read.
type OneHotLayer ¶ added in v0.3.0
A OneHotLayer is a layer that performs a one-hot encoding of the input. The input should be a 1D tensor of integers (batchsize,). The output will be a 2D tensor of the specified dtype (batchsize, numClasses).
func (*OneHotLayer) MustAttach ¶ added in v0.3.0
func (l *OneHotLayer) MustAttach(n *G.Node) *G.Node
func (*OneHotLayer) Parameters ¶ added in v0.3.0
func (*OneHotLayer) Parameters() map[string]*G.Node
Parameters implements Layer.
type ReshapeLayer ¶ added in v0.0.4
ReshapeLayer is a reshape layer.
- Input Shape: any shape
- Output Shape: the specified shape [as long as both shapes have the same volume]
func Reshape ¶ added in v0.0.4
func Reshape(model *Model, name string, newShape T.Shape) *ReshapeLayer
Reshape creates a new ReshapeLayer on the Model with the given target shape.
func (*ReshapeLayer) MustAttach ¶ added in v0.0.4
func (l *ReshapeLayer) MustAttach(n *G.Node) *G.Node
MustAttach attaches the ReshapeLayer to the given node. It panics on error.
func (*ReshapeLayer) Parameters ¶ added in v0.0.5
func (l *ReshapeLayer) Parameters() map[string]*G.Node
Parameters returns a map of the parameters of the layer.
type TensorTrainingDataGenerator ¶ added in v0.1.0
type TensorTrainingDataGenerator struct {
// contains filtered or unexported fields
}
TensorTrainingDataGenerator is a TrainingDataGenerator that uses tensors as inputs and outputs. It should only be used with small datasets, as it requires the entire dataset to be loaded into memory at once.
func NewTTDG ¶ added in v0.1.0
func NewTTDG(xs, ys map[string]T.Tensor) *TensorTrainingDataGenerator
NewTTDG creates a new TensorTrainingDataGenerator. This is used by the fit methods of the model to generate batches of data. The inputs and outputs are the training data and labels respectively. They are maps to support multiple inputs and outputs; if you only have one input and one output, pass a map with a single entry for each.
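For illustration, assuming xTrain and yTrain are tensors, solver is a gorgonia solver, and the model was built with an input named "x" and a loss requirement named "yt":

gen := goras.NewTTDG(
	map[string]T.Tensor{"x": xTrain},
	map[string]T.Tensor{"yt": yTrain},
)
model.MustFitGenerator(gen, solver)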
func (*TensorTrainingDataGenerator) NumBatches ¶ added in v0.1.0
func (t *TensorTrainingDataGenerator) NumBatches() int
func (*TensorTrainingDataGenerator) Reset ¶ added in v0.1.0
func (t *TensorTrainingDataGenerator) Reset(batchSize int) error
type TrainingDataGenerator ¶ added in v0.1.0
type TrainingDataGenerator interface {
	// NextBatch returns the next batch of data and labels. If there is no more data, it should return nil, nil, nil.
	NextBatch() (map[string]T.Tensor, map[string]T.Tensor, error)
	Reset(batchSize int) error // Resets the generator for the next epoch
	NumBatches() int           // Returns the number of batches in this epoch
}
TrainingDataGenerator is used by a model to generate data on-the-fly during training.