I met @sebastien-forestier last week and we have a proposed architecture for the development of GMR. As has been said before, it should be possible to use GMR in any context. That is why a GMR class will be created in models and will inherit from sklearn.mixture.GMM, like the GMM class in gmminf.py. In addition, it will contain the regression methods LSE, SLSE, stochastic sampling, and the "Jonathan sampling" method described by @oudeyer, inspired by @jgrizou's notebook. The number of Gaussians is given as a parameter.
NB:
- LSE: weighted mean.
- SLSE: LSE regression using only one Gaussian: the one whose projection onto y has the largest "weight".
- Stochastic sampling: draw x from the distribution P(x | y) encoded by the Gaussian mixture.
- Jonathan sampling (I do not know what to call it): find the x that maximizes the probability P(x | y), the method Jonathan discusses in his notebook (though Ghahramani does not mention it). This requires a stochastic optimization algorithm, since there is no analytical solution. The method can also be used to learn redundant inverse models directly.
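To make the LSE and SLSE variants concrete, here is a minimal numpy sketch of conditioning a joint Gaussian mixture over (x, y) on an observed y. Function names and the argument layout are assumptions for illustration, not the proposed API; a real GMR class would read the weights, means, and covariances from the fitted sklearn mixture instead of taking them as arguments.

```python
import numpy as np

def conditional_components(weights, means, covs, y, x_idx, y_idx):
    """For each component k of the joint mixture over (x, y), compute the
    conditional mean E[x | y] and the responsibility
    beta_k ∝ pi_k * N(y; mu_y_k, Sigma_yy_k)."""
    betas, cond_means = [], []
    for pi_k, mu, S in zip(weights, means, covs):
        mu_x, mu_y = mu[x_idx], mu[y_idx]
        S_yy = S[np.ix_(y_idx, y_idx)]
        S_xy = S[np.ix_(x_idx, y_idx)]
        inv_S_yy = np.linalg.inv(S_yy)
        diff = y - mu_y
        # density of y under component k (shared (2*pi)^(-d/2) factor omitted,
        # it cancels in the normalization below)
        dens = np.exp(-0.5 * diff @ inv_S_yy @ diff) / np.sqrt(np.linalg.det(S_yy))
        betas.append(pi_k * dens)
        cond_means.append(mu_x + S_xy @ inv_S_yy @ diff)
    betas = np.array(betas)
    return betas / betas.sum(), np.array(cond_means)

def lse(weights, means, covs, y, x_idx, y_idx):
    """LSE: weighted mean of the per-component conditional means."""
    betas, cond_means = conditional_components(weights, means, covs, y, x_idx, y_idx)
    return betas @ cond_means

def slse(weights, means, covs, y, x_idx, y_idx):
    """SLSE: conditional mean of the single component with the largest weight on y."""
    betas, cond_means = conditional_components(weights, means, covs, y, x_idx, y_idx)
    return cond_means[np.argmax(betas)]
```

Stochastic sampling would instead draw a component k with probability beta_k and then sample x from that component's conditional Gaussian; the Jonathan method would run a stochastic optimizer over x on the full conditional density.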
A second class, ILOGMR, will be implemented and will inherit from the SensorimotorModels abstract class. It will be usable as both an inverse and a forward model. In addition, users will choose how they want to compute the inverse model (directly or using optimization).
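A rough skeleton of what that second class could look like, with a locally defined stand-in for the abstract base class since I am not sure of its exact interface. The method names (update, forward, inverse), the inverse_mode parameter, and the single-Gaussian conditioning used as a placeholder are all assumptions; the real class would fit a GMM and apply the regression methods above.

```python
from abc import ABC, abstractmethod
import numpy as np

class SensorimotorModel(ABC):
    """Stand-in for the project's abstract base class (interface is an assumption)."""
    @abstractmethod
    def update(self, m, s): ...
    @abstractmethod
    def forward(self, m): ...
    @abstractmethod
    def inverse(self, s): ...

class ILOGMR(SensorimotorModel):
    """Sketch: stores (m, s) pairs and conditions a single joint Gaussian fit
    to the data; a real version would fit a GMM and use LSE/SLSE/sampling.
    inverse_mode selects direct conditioning vs stochastic optimization."""
    def __init__(self, m_dim, s_dim, inverse_mode='direct'):
        self.m_dim, self.s_dim = m_dim, s_dim
        self.inverse_mode = inverse_mode
        self.data = []

    def update(self, m, s):
        self.data.append(np.hstack((m, s)))

    def _condition(self, known_idx, unknown_idx, value):
        X = np.array(self.data)
        mu = X.mean(axis=0)
        # small regularization keeps the covariance invertible
        S = np.cov(X.T) + 1e-6 * np.eye(X.shape[1])
        S_kk = S[np.ix_(known_idx, known_idx)]
        S_uk = S[np.ix_(unknown_idx, known_idx)]
        return mu[unknown_idx] + S_uk @ np.linalg.solve(S_kk, value - mu[known_idx])

    def forward(self, m):
        m_idx = list(range(self.m_dim))
        s_idx = list(range(self.m_dim, self.m_dim + self.s_dim))
        return self._condition(m_idx, s_idx, np.asarray(m))

    def inverse(self, s):
        m_idx = list(range(self.m_dim))
        s_idx = list(range(self.m_dim, self.m_dim + self.s_dim))
        # 'direct' conditions P(m | s); 'optimization' would instead search
        # for the m maximizing P(m | s) with a stochastic optimizer (not sketched)
        return self._condition(s_idx, m_idx, np.asarray(s))
```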
Do you think this structure is fine? Am I forgetting something?