lasagne.layers
get_output | Computes the output of the network at one or more given layers. |
get_output_shape | Computes the output shape of the network at one or more given layers. |
get_all_layers | This function gathers all layers below one or more given Layer instances, including the given layer(s). |
get_all_params | This function gathers all parameters of all layers below one or more given Layer instances, including the layer(s) itself. |
count_params | This function counts all parameters (i.e., the number of scalar values) of all layers below one or more given Layer instances, including the layer(s) itself. |
get_all_param_values | This function returns the values of the parameters of all layers below one or more given Layer instances, including the layer(s) itself. |
set_all_param_values | Given a list of numpy arrays, this function sets the parameters of all layers below one or more given Layer instances (including the layer(s) itself) to the given values. |
Layer | The Layer class represents a single layer of a neural network. |
MergeLayer | This class represents a layer that aggregates input from multiple layers. |
InputLayer | This layer holds a symbolic variable that represents a network input. |
DenseLayer | A fully connected layer. |
NonlinearityLayer | A layer that just applies a nonlinearity. |
NINLayer | Network-in-network layer. |
Conv1DLayer | 1D convolutional layer. |
Conv2DLayer | 2D convolutional layer. |
MaxPool1DLayer | 1D max-pooling layer. |
MaxPool2DLayer | 2D max-pooling layer. |
Pool2DLayer | 2D pooling layer. |
GlobalPoolLayer | Global pooling layer. |
FeaturePoolLayer | Feature pooling layer. |
FeatureWTALayer | ‘Winner Take All’ layer. |
CustomRecurrentLayer | A layer which implements a recurrent connection. |
RecurrentLayer | Dense recurrent neural network (RNN) layer. |
LSTMLayer | A long short-term memory (LSTM) layer. |
GRULayer | Gated Recurrent Unit (GRU) layer. |
Gate | Simple class to hold the parameters for a gate connection. |
DropoutLayer | Dropout layer. |
dropout | alias of DropoutLayer |
GaussianNoiseLayer | Gaussian noise layer. |
ReshapeLayer | A layer reshaping its input tensor to another tensor of the same total number of elements. |
reshape | alias of ReshapeLayer |
FlattenLayer | A layer that flattens its input. |
flatten | alias of FlattenLayer |
DimshuffleLayer | A layer that rearranges the dimensions of its input tensor, maintaining the same total number of elements. |
dimshuffle | alias of DimshuffleLayer |
PadLayer | Pad all dimensions except the first batch_ndim with width zeros on both sides, or with another value specified in val. |
pad | alias of PadLayer |
SliceLayer | Slices the input at a specific axis and at specific indices. |
ConcatLayer | Concatenates multiple inputs along the specified axis. |
concat | alias of ConcatLayer |
ElemwiseMergeLayer | This layer performs an elementwise merge of its input layers. |
ElemwiseSumLayer | This layer performs an elementwise sum of its input layers. |
EmbeddingLayer | A layer for word embeddings. |
corrmm.Conv2DMMLayer | 2D convolutional layer. |
cuda_convnet.Conv2DCCLayer | 2D convolutional layer. |
cuda_convnet.MaxPool2DCCLayer | 2D max-pooling layer. |
cuda_convnet.ShuffleBC01ToC01BLayer | Shuffles 4D input from bc01 (batch-size-first) order to c01b. |
cuda_convnet.bc01_to_c01b | alias of ShuffleBC01ToC01BLayer |
cuda_convnet.ShuffleC01BToBC01Layer | Shuffles 4D input from c01b (batch-size-last) order to bc01. |
cuda_convnet.c01b_to_bc01 | alias of ShuffleC01BToBC01Layer |
cuda_convnet.NINLayer_c01b | Network-in-network layer with c01b axis ordering. |
dnn.Conv2DDNNLayer | 2D convolutional layer. |
dnn.MaxPool2DDNNLayer | 2D max-pooling layer. |
dnn.Pool2DDNNLayer | 2D pooling layer. |