In the current implementation, ReLU is applied as a function (e.g. `F.relu`) after each convolution layer.
The guided back-propagation tutorials I can find online register their hooks on ReLU implemented as a module (`nn.ReLU`), since hooks attach to modules rather than to functional calls.
I am not sure what the right way would be to modify YuGCN to make this process easier. cc @ltetrel
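One option would be to replace the functional ReLU calls with `nn.ReLU` submodules, so that guided back-propagation hooks can be attached via `named_modules()`. A minimal sketch of the idea (the `TinyConvNet` model below is hypothetical, not the actual YuGCN architecture):

```python
import torch
import torch.nn as nn

class TinyConvNet(nn.Module):
    # Hypothetical stand-in model; layer sizes are illustrative only.
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 4, kernel_size=3, padding=1)
        # Declaring ReLU as a module (instead of calling F.relu in forward)
        # makes it discoverable via model.modules(), so hooks can attach.
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.conv(x))

def guided_relu_hook(module, grad_input, grad_output):
    # Guided backprop: only let positive gradients flow back through ReLU.
    return (torch.clamp(grad_input[0], min=0.0),)

model = TinyConvNet()
handles = [
    m.register_full_backward_hook(guided_relu_hook)
    for m in model.modules()
    if isinstance(m, nn.ReLU)
]

x = torch.randn(1, 1, 8, 8, requires_grad=True)
model(x).sum().backward()

# Remove hooks once the guided gradients have been collected.
for h in handles:
    h.remove()
```

The same hook-registration loop would work on YuGCN unchanged once its ReLUs are modules, which is presumably why the tutorials assume that style.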