Optimization schemes with complex numbers are widely used in physics and, recently, in machine learning.
I strongly suggest adding support for complex numbers to optimization engines like RmsProp et al.
Only a few lines of change and several tests are needed.
E.g. climin/rmsprop.py, lines 165-167:

    self.moving_mean_squared = (
        self.decay * self.moving_mean_squared
        + (1 - self.decay) * gradient ** 2)
    --> + (1 - self.decay) * np.abs(gradient) ** 2)
A single-line change would make it applicable to complex numbers.
The same is true for Adam and Adadelta.
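To illustrate the point, here is a minimal sketch (not climin's actual code; the function name is mine) of the proposed accumulator update. Using `np.abs(gradient) ** 2` instead of `gradient ** 2` keeps the moving mean real and non-negative for complex gradients, so the later division by its square root stays well defined:

```python
import numpy as np

def rmsprop_msq_update(msq, gradient, decay=0.9):
    # Proposed change: squared magnitude |g|^2 rather than g**2,
    # which would be complex-valued for complex gradients.
    return decay * msq + (1 - decay) * np.abs(gradient) ** 2

g = np.array([3 + 4j, 1 - 2j])
msq = rmsprop_msq_update(np.zeros(2), g)
# |3+4j|^2 = 25 and |1-2j|^2 = 5, each weighted by (1 - 0.9)
print(msq)  # [2.5 0.5]
```

For real gradients `np.abs(g) ** 2` equals `g ** 2`, so existing behavior is unchanged.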
On the other hand, GradientDescent already works without any change.
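A quick sketch of why plain gradient descent needs no modification (this is a standalone toy loop, not climin's API): for f(z) = |z - c|^2 the (Wirtinger) gradient is z - c, and the usual update rule applies directly to a complex parameter:

```python
import numpy as np

c = 2 - 1j          # target
z = 0 + 0j          # complex parameter being optimized
lr = 0.1
for _ in range(200):
    z -= lr * (z - c)   # ordinary gradient-descent step, complex arithmetic
print(np.round(z, 6))   # converges to 2 - 1j
```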
Rprop may need a bit more effort; I have no clue yet how to make it compatible with complex numbers, since the sign function is ill-defined for complex arguments.
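One conceivable workaround, offered purely as an assumption and not tested against Rprop's logic: apply the sign componentwise to the real and imaginary parts, which reduces to the ordinary sign for real gradients.

```python
import numpy as np

def complex_sign(z):
    # Hypothetical helper: componentwise sign for complex arrays.
    # For real input it falls back to the ordinary np.sign.
    z = np.asarray(z)
    if np.iscomplexobj(z):
        return np.sign(z.real) + 1j * np.sign(z.imag)
    return np.sign(z)

print(complex_sign(np.array([3 - 4j, -1 + 0j])))  # [ 1.-1.j -1.+0.j]
print(complex_sign(np.array([-2.0, 5.0])))        # [-1.  1.]
```

Whether Rprop's sign-agreement test between successive gradients still makes sense under this definition is an open question.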