Save and load optimizer state in bilby #5
Open
Description
Currently, bilby auto-saves parameters during a training run and loads them back in on restart if they are available. However, it should also save the state of the optimizer. The Adam algorithm accumulates momentum from past gradient steps, and this state is currently lost when a training run is interrupted. It should be possible to serialize and deserialize the optax optimizer state that represents this algorithm's internal variables. That state should then be autoloaded when bilby is started, if it exists. There should also be a command-line option to disable this behavior, or to specify the path to the serialized Adam state.