Conversation
    rightPad = strRightPad.Atoi();
    } break;
    }
    ++idxToken;
What is a valid string here? Add some information in the comments (e.g. is "1, 2, 3 4" valid?). Any caveats the caller should be aware of?
Sure. I don't have any checks for negative numbers in the padding string. Is that check necessary? I'll add the order of the paddings to the comments nevertheless.
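A minimal sketch of the kind of check being discussed, assuming a comma-separated option string and plain standard C++; the delimiter, names, and error handling here are illustrative, not the actual MethodDL parsing:

    // Illustrative only: parse a padding string of the form "top,bottom,left,right"
    // (e.g. "1,2,3,4") and reject negative values instead of silently accepting them.
    #include <sstream>
    #include <stdexcept>
    #include <string>

    struct PaddingSpec {
       int top = 0, bottom = 0, left = 0, right = 0;
    };

    PaddingSpec ParsePaddingString(const std::string &opt)
    {
       std::istringstream in(opt);
       std::string token;
       int values[4] = {0, 0, 0, 0};
       for (int i = 0; i < 4; ++i) {
          if (!std::getline(in, token, ','))
             throw std::invalid_argument("expected four comma-separated pad values: " + opt);
          values[i] = std::stoi(token);
          if (values[i] < 0) // guard discussed above: negative padding is not meaningful
             throw std::invalid_argument("negative padding is not allowed: " + token);
       }
       return {values[0], values[1], values[2], values[3]};
    }

Documenting the expected order (top, bottom, left, right) next to the parsing code, as suggested, would answer the "what is a valid string" question for callers.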
tmva/tmva/src/MethodDL.cxx (outdated)
    Log() << kFATAL << "LSTM Layer is not yet fully implemented" << Endl;
    //ParseLstmLayer(deepNet, nets, layerString->GetString(), subDelimiter);
    } else if (strLayerType == "PADDING") {
       ParseRnnLayer(deepNet, nets, layerString->GetString(), subDelimiter);
tmva/tmva/inc/TMVA/DNN/DeepNet.h (outdated)
    /*! Function for adding a Padding Layer to the Deep Neural Network, with given
     * top, bottom, left and right paddings. It will take every matrix from the
     * previous layer and pad it with zeros to a matrix with new dimensions. */
    TPaddingLayer<Architecture_t> *AddPaddingLayer(size_t topPad, size_t bottomPad, size_t leftPad, size_t rightPad);
Can we also have a factory method for using "full" and "valid" paddings (without the user needing to specify exact dimensions)?
Should we make this a factory method in the Padding Layer itself, or have this option only for the conv layer?
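As a sketch of what such a factory option could look like, here is one way the four pads might be derived from a mode name and the filter dimensions, assuming stride 1; the function and type names are illustrative and not part of the TMVA API:

    // Illustrative only: compute the pad sizes for named padding modes.
    // "valid": no padding; "same": output size equals input size (total pad = k - 1);
    // "full": every partial overlap is kept (pad = k - 1 on each side).
    #include <cstddef>
    #include <stdexcept>
    #include <string>

    struct Padding2D {
       std::size_t top, bottom, left, right;
    };

    Padding2D MakeNamedPadding(const std::string &mode, std::size_t filterH, std::size_t filterW)
    {
       if (mode == "valid") return {0, 0, 0, 0};
       if (mode == "same") {
          std::size_t padH = filterH - 1, padW = filterW - 1;
          return {padH / 2, padH - padH / 2, padW / 2, padW - padW / 2};
       }
       if (mode == "full") return {filterH - 1, filterH - 1, filterW - 1, filterW - 1};
       throw std::invalid_argument("unknown padding mode: " + mode);
    }

Whether this lives in the padding layer or only in the conv layer, the computation itself is the same; only the place where the filter dimensions are known differs.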
    namespace CNN {

    template <typename Architecture_t>
    class TPaddingLayer : public VGeneralLayer<Architecture_t>
If I understand correctly, this is more appropriately named PaddingLayer2D, since it assumes a 2D spatial layout. (Then we can also integrate 1D and 3D layers later; this is the approach of e.g. Keras.)
    fTopPad(topPad), fBottomPad(bottomPad), fLeftPad(leftPad), fRightPad(rightPad)
    {
       this->outputHeight = inputHeight + topPad + bottomPad;
Create private functions for this? The calculation can be reused in the constructor for the "valid" and "full" options.
Yes. I have made this change already. I'll push it soon.
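For illustration, a sketch of the kind of refactoring being discussed, with the output-size arithmetic pulled into small private helpers so it can be shared by an explicit-pad constructor and by "valid"/"full" constructors; the names here are assumptions, not the actual implementation:

    // Illustrative only: factor the padded-output-size computation out of the
    // constructor body so any constructor variant can reuse it.
    #include <cstddef>

    class TPaddingLayerSketch {
    public:
       TPaddingLayerSketch(std::size_t inH, std::size_t inW,
                           std::size_t top, std::size_t bottom,
                           std::size_t left, std::size_t right)
          : fOutputHeight(calculateHeight(inH, top, bottom)),
            fOutputWidth(calculateWidth(inW, left, right))
       {
       }

       std::size_t GetOutputHeight() const { return fOutputHeight; }
       std::size_t GetOutputWidth() const { return fOutputWidth; }

    private:
       // The same arithmetic serves the explicit-pad constructor and any
       // constructor that first derives the pads from a named mode.
       static std::size_t calculateHeight(std::size_t inH, std::size_t top, std::size_t bottom)
       {
          return inH + top + bottom;
       }
       static std::size_t calculateWidth(std::size_t inW, std::size_t left, std::size_t right)
       {
          return inW + left + right;
       }

       std::size_t fOutputHeight;
       std::size_t fOutputWidth;
    };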
This layer introduces arbitrary padding as a separate layer instead of having it in convolution. The idea is to allow only fixed padding types for the convolution layer, such as VALID, SAME, FULL, etc. Since arbitrary padding is still required in some architectures, it has been created as a separate layer. The layer takes 4 arguments: topPad, bottomPad, leftPad and rightPad. The expected format for the string is this:

    PADDING2D|topPad|bottomPad|leftPad|rightPad

This contains the naive implementation of padding. More efficient approaches, such as pre-allocation and a single data structure for the propagations, are yet to be discussed.

TO-DO:
Tests for padding
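For concreteness, a minimal sketch of what the naive zero-padding step amounts to, assuming plain row-major std::vector matrices; the real layer operates on the backend's matrix type and on whole batches:

    // Illustrative only: copy an input matrix into a larger zero matrix,
    // shifted by the top/left pads.
    #include <cstddef>
    #include <vector>

    using Matrix = std::vector<std::vector<double>>;

    Matrix PadWithZeros(const Matrix &input, std::size_t top, std::size_t bottom,
                        std::size_t left, std::size_t right)
    {
       const std::size_t inH = input.size();
       const std::size_t inW = inH ? input[0].size() : 0;
       // Output starts as all zeros with the enlarged dimensions.
       Matrix output(inH + top + bottom, std::vector<double>(inW + left + right, 0.0));
       // Copy the input into the interior of the padded matrix.
       for (std::size_t i = 0; i < inH; ++i)
          for (std::size_t j = 0; j < inW; ++j)
             output[i + top][j + left] = input[i][j];
       return output;
    }

A layout string such as "PADDING2D|1|1|2|2" (a hypothetical example of the format above) would correspond to PadWithZeros(x, 1, 1, 2, 2) applied to every matrix coming from the previous layer.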