The network had thirteen layers:
- Convolutional layer: 3 × 3 kernel with 70 filters
- Max pooling layer: stride of two
- Convolutional layer: 3 × 3 kernel with 60 filters
- Max pooling layer: stride of two
- Convolutional layer: 3 × 3 kernel with 50 filters
- Max pooling layer: stride of two
- Convolutional layer: 3 × 3 kernel with 50 filters
- Upscaling layer: scale factor of two
- Convolutional layer: 3 × 3 kernel with 60 filters
- Upscaling layer: scale factor of two
- Convolutional layer: 3 × 3 kernel with 70 filters
- Upscaling layer: scale factor of two
- Convolutional layer: 3 × 3 kernel with one filter

All layers used a rectified linear unit (ReLU) activation function except for the last, which used a sigmoid activation.
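The layer list above can be sanity-checked by tracing feature-map shapes through the network. The sketch below is a minimal illustration, not the authors' code; it assumes "same"-padded convolutions, 2 × 2 max pooling, nearest-neighbour upscaling, and a hypothetical 128 × 128 single-channel input, none of which are stated in the text.

```python
# Trace feature-map shapes through the 13-layer encoder-decoder.
# Assumptions (not given in the source): "same"-padded 3x3 convolutions,
# 2x2 max pooling with stride 2, scale-factor-2 upscaling, and a
# hypothetical 128x128 single-channel input image.

layers = [
    ("conv", 70), ("pool", None), ("conv", 60), ("pool", None),
    ("conv", 50), ("pool", None), ("conv", 50), ("up", None),
    ("conv", 60), ("up", None), ("conv", 70), ("up", None),
    ("conv", 1),
]

def trace(h, w, c):
    """Return the (height, width, channels) shape after every layer."""
    shapes = [(h, w, c)]
    for kind, filters in layers:
        if kind == "conv":    # same padding: spatial size unchanged
            c = filters
        elif kind == "pool":  # stride of two halves each spatial dimension
            h, w = h // 2, w // 2
        elif kind == "up":    # scale factor of two doubles each dimension
            h, w = h * 2, w * 2
        shapes.append((h, w, c))
    return shapes

shapes = trace(128, 128, 1)
print(shapes[0], "->", shapes[-1])  # (128, 128, 1) -> (128, 128, 1)
```

The three pooling layers halve the spatial resolution three times and the three upscaling layers double it three times, so the final 1-filter convolution restores the input's spatial size, as expected for an encoder-decoder of this shape.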