Part of [coverage lists](https://github.com/mrkn/mxnet.rb/issues/7)

Activations

- [ ] LeakyReLU
- [ ] PReLU
- [ ] ELU
- [ ] SELU
- [ ] GELU
- [ ] Swish

Layers

- [x] Sequential
- [x] HybridSequential
- [x] Dense
- [ ] Dropout
- [ ] BatchNorm
- [ ] Embedding
- [x] Flatten
- [ ] InstanceNorm
- [ ] LayerNorm
- [ ] GroupNorm
- [ ] Lambda
- [ ] HybridLambda
- [x] _Conv
  - [ ] Conv1D
  - [x] Conv2D
  - [ ] Conv3D
  - [ ] Conv1DTranspose
  - [ ] Conv2DTranspose
  - [ ] Conv3DTranspose
- [x] _Pooling
  - [ ] MaxPool1D
  - [x] MaxPool2D
  - [ ] MaxPool3D
  - [ ] AvgPool1D
  - [ ] AvgPool2D
  - [ ] AvgPool3D
  - [ ] GlobalMaxPool1D
  - [ ] GlobalMaxPool2D
  - [ ] GlobalMaxPool3D
  - [ ] GlobalAvgPool1D
  - [ ] GlobalAvgPool2D
  - [ ] GlobalAvgPool3D
- [ ] ReflectionPad2D

RNN

- [ ] GRU Cell
- [ ] LSTM Cell
- [ ] GRU Layer
- [ ] LSTM Layer

Loss

- [x] L2Loss
- [x] L1Loss
- [ ] SigmoidBinaryCrossEntropyLoss
- [x] SoftmaxCrossEntropyLoss
- [ ] KLDivLoss
- [ ] CTCLoss
- [ ] HuberLoss
- [ ] HingeLoss
- [ ] SquaredHingeLoss
- [ ] LogisticLoss
- [ ] TripletLoss
- [ ] PoissonNLLLoss
- [ ] CosineEmbeddingLoss
- [ ] SDMLLoss